perm filename BM[BMP,SYS] blob
sn#663340 filedate 1982-06-18 generic text, type C, neo UTF8
COMMENT ⊗ VALID 00039 PAGES
C REC PAGE DESCRIPTION
C00001 00001
C00005 00002 Plan for basic BM work
C00011 00003 Full BMT -- the theory with shells and wfn -- ala PRA -- notes
C00012 00004 BMA -- the language and propositional theory --
C00016 00005 BMA -- the shell principle
C00022 00006 More on Typesets, Type restricitons, etc.
C00026 00007 BMA -- Measured lexicographic ordering
C00027 00008 BMA -- the recursion principle
C00031 00009 BMA -- the induction principle
C00033 00010 PBM
C00034 00011 NBM -- the NUMBERP shell
C00037 00012 GROUND.ZERO -- NBM + LISTP + LITATOM + ...
C00043 00013 Interpreting BMA in NBM
C00047 00014 Reductions -- NBM recursion + SUB1P induction ==> NBM induction
C00052 00015 Reduction to standard form of recursion
C00053 00016
C00054 00017 Results from literature
C00057 00018
C00058 00019 Analysis of BM-recursive via interpreter
C00063 00020 BM recursion via finite approximations
C00066 00021 Forms of definition -- ACK examples --
C00071 00022 BMH Heuristics -- Notes on structures and strategies of BMTP
C00074 00023 Simplify-clause[Clause] <=
C00077 00024 Rewrite-term[atm,TSalist,Valist,hyps,mode,hopes,div] <=
C00083 00025 Assume-t-or-f[atm,TSalist] <=
C00089 00026 Possible Decision procedures buried in SIMPLIFY
C00091 00027 Machines
C00094 00028 General thoughts about TP data machine
C00096 00029 Notes on heuristics
C00098 00030 Notes on BM
C00103 00031 Some (more or less) standard notation for ordinals, schema, systems
C00105 00032 Summary of results (translated in to std notation)
C00106 00033 [From Tait68 Nested]
C00111 00034 Bibliography: Ordinal Recursion, Induction, and quantifier complexity in
C00127 00035 Other files
C00128 00036 Ref to get
C00129 00037 ∂19-Sep-81 1355 CLT
C00155 00038 Notes to send
C00156 00039 % ____________________________________________________________________ %
C00157 ENDMK
C⊗;
Plan for basic BM work
BMA -- Boyer - Moore `arithmetic'
I. Formulate the theory {T}+{F}+Shells*+FunDef+Ind{wfn,measure}
II. NBM -- the NUMBERP shell
III. BBM -- based on NUMBERP,CONSP,LITATOM shells -- GROUND.ZERO
IV. Interpretation of BMA in NBM
V. Reduction to Ind{LESSP}
VI. Reduction to standard form of (nested) recursion
VII. Remarks on choice of formulation
need for suitable rules for formulation and implementation of heuristics
restriction to propositions of form t ≠ F
Numerical BM
Relation to extensions of PRA
(1) Formulate PRA, PRA→ (with T,F,if) and show equivalence
(2) Let @I be family of PRWOs
PRA(@I) <::> PRA plus <<-UREC for << in @I
PRA(@I)# <::> PRA plus <<-NREC for << in @I
similarly for PRA→
Mappings PRA <--> PRA→ should lift uniformly
(3) Assume @I closed under *
BM(@I) is PRA→ with {<<,rho}-BMRec, {<<,rho}-BMInd
(for rho BM(@I) definable measure)
Theorem
PRA→(@I)# is subtheory of BM(@I)
PRA→(@I)# closed under {<<,rho}-BMrec
PRA→(@I)# closed under {<<,rho}-BMInd
(for rho PRA→(@I)# definable measure)
thus PRA(@I)# and BM(@I) are essentially equivalent
strongly
The WFO defined by function definition --
subgoal induction/bar induction
Relation to fragments of PA (Parsons)
Let @F be family of formulas (of PA)
PA|@F-IR (Induction rule restricted to @F)
PA|@F-IA (Induction axiom restricted to @F)
For certain @I want a theorem of the form
(a) PA|@F-IR |- ∀x∃yA[x,y] ==> PRA(@I)# |- A[x,f[y]]
(b) f a PRA(@I)# function F the corresponding relation
PA|@F-IR |- ∀x∃yF[x,y]
ditto for IA or result that IR and IA agree on class of formulae considered
Relation to alternate formulations of classes of recursive functions
(1) peter
(2) majorizing hier (Wainer,Schwichtenberg)
(3) kleene hier
(4) finite type (Tait, Schwichtenberg, Howard)
[Detailed results for @I={<a | a=w\n n<w}
general results for @I ~ {<a | a < w{n}} and @I ~ {<a | a < e0}
]
Further work on BMT
(*) higher ordinals
(1) teach TP about e0,G0
(2) work out examples using larger ordinals in BMT
Interpreters for classes of rec funs
Interpreters for funs of higher type
Majoriz hierarchies (as funs of indices)
(3) consider additional principles of construction of WFO's
ranking by measures on components of construction?
c = w\a + b
(*) a meta theorem about functions on OTM/EQ ?
[(EQ x y) -> (EQ (f x) (f y))] ∧ [(CNFP x) -> phi[f,x]] -> [(OTM x) -> phi[f,x]]
(*) imbedding wfos into OTM, constructions from LESSP via exp,+,*,'
(*) T-pred for BM
Write reduction opn; and encode BM{<<} defns and notion of comp state
st f is BM{<<} acceptable iff f[x]=It[red{ctxt},g[x],init['f`,x]] some
g computing the length of computation
(*) higher types
(*) quantifiers
what effect does adding quantifiers have on current BM heuristics
How do heuristics extend or not in extensions 2-4?
(*) other principles -- partial reflection,FAC,AS,...
(*) Add decidable FIG (via char funs)?
(*) Add general principle of FIG + QF-CA (e.g. real class vars)
(*) Analysis of TP program
Termination?
Are there interesting decidable classes buried in the heuristics
Typesets+shell notion might give interesting class
Compare Oppen Decidable class to BM ``explicit-value''-terms
(*) Read Bledsoe papers
Full BMT -- the theory with shells and wfn -- ala PRA -- notes
Declarations of function type: btm,rec,cons,ac,wfn
Constructive histories (contexts)
Derivation wrt history/theorem of H
notion of well-formed term and other `syntactic' notions relative to history
LESSP vs SUB1P form of BM
using reduction of BMinduction to BMrec+SUB1Pinduction
need only show BMrec reducible to SUB1P\k rec
reduction to LESSP+COUNT induction.lemmas
BMA -- the language and propositional theory --
Quantifier free theory of shell structures and recursively defined functions
Language
Variables -- officially X0,X1,...
Function symbols -- symbols for functions of each arity ≥0
including those given below and additional symbols introduced
according to the function definition rule.
Propositional functions --
TRUE:0,TRUEP:1
FALSE:0,FALSEP:1
IF:3,EQUAL:2
V,V0,... will stand for variables
f,g,h will stand for function symbols
Remark: Following the style of BM we will use upper case identifiers
for function symbols and specify the arity.
We will use BM names for function constants of the basic theory, but
terms and other expressions will be written in more standard mathematical
notation.
Terms -- t,t0,t1..
~~~~~~~~~~~~~~~~~~
Variables
f[t1,..,tk] if f has arity k
Notation: f[t] will also be written f t when not ambiguous
Formulas -- @A,@B,...
~~~~~~~~~~~~~~~~~~~~~
t0=t1
¬@A, @A->@B, @A∧@B, @A∨@B
Metanotions
~~~~~~~~~~~
Substitution s is given by a set of assignments {...,Vi<-ti,...}.
For a formula @A (term t) and substitution s, @A/s (t/s)
is the result of simultaneously replacing all occurrences of Vi in @A (t)
by ti (there need not be any) for each assignment Vi<-ti in s.
For a term t the corresponding proposition is defined as t≠FALSE[]
(e.g. t≠F see below)
Function symbols will be further classified according to the means
of introduction, as will be discussed in the preamble to the principles for
introduction of functions and defining axioms.
I. Logical axioms and rules.
~~~~~~~~~~~~~~~~~~~~~~~~~
A. (TAUT) Truthfunctional tautologies
B. (EQUALITY) Equality reasoning
C. (INSTANTIATION) @A // @A/s
D. (MP) @A; @A->@B // @B
Propositional theory
~~~~~~~~~~~~~~~~~~~~
Abbreviations:
T :: TRUE[]
F :: FALSE[].
Axioms:
T≠F
X≠F -> IF[X,Y,Z] = Y
X=F -> IF[X,Y,Z] = Z
X=Y -> EQUAL[X,Y] = T
X≠Y -> EQUAL[X,Y] = F
TRUEP[X] = EQUAL[X,T]
FALSEP[X] = EQUAL[X,F]
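As a quick sanity check of these axioms, a minimal executable model
(a sketch only; T, F and the connectives are represented by hypothetical
Python values, which is not part of the theory itself):

  # Sketch: an executable model of the propositional primitives, with T and F
  # as two distinct Python tokens.  Anything other than F counts as "true" in
  # IF, exactly as in the axioms X≠F -> IF[X,Y,Z]=Y and X=F -> IF[X,Y,Z]=Z.
  T = ('TRUE',)
  F = ('FALSE',)

  def IF(x, y, z):    return y if x != F else z
  def EQUAL(x, y):    return T if x == y else F
  def TRUEP(x):       return EQUAL(x, T)
  def FALSEP(x):      return EQUAL(x, F)

  assert T != F
  assert IF(EQUAL(1, 1), 'then', 'else') == 'then'
  assert IF(F, 'then', 'else') == 'else'
  assert TRUEP(T) == T and FALSEP(T) == F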
BMA -- the shell principle
The shell principle provides for definition of certain inductively
defined subdomains of the universe. This is accomplished by
introducing a collection of functions that describe the construction
of elements of the domain and an ordering relation corresponding
to the inductive generation that is asserted to be a WFN.
Recognizers, constructors, bottoms, accessors
e.g. we extend a context C to context C1 by adding axioms for shell functions
(and making declarations about function classifications)
Informally we can think of the elements of a shell as tagged n-tuples
with different shells having different tags. Each component
may be restricted to belong to one of a given collection of shells.
In addition a shell may have a `bottom' element.
To describe a shell we need the notions of typeset and typeset restriction
Types are the classifications introduced by shell principle.
A type recognizer is either TRUEP,FALSEP,or a shell recognizer.
(e.g. declare the singletons T,F as shells with recognizers TRUEP,FALSEP.)
Let S{r} denote the elements of type r, -S{r} the elements not of type r.
A typeset has one of the forms
ONE.OF<r1...rk> - the union of a finite number of types - S{r1}∪...∪S{rk}
NONE.OF<r1...rk> - the complement of ONE.OF - -S{r1}∩...∩-S{rk}
The corresponding type restriction tr (on the variable X) is the term
IF[r1[X],T,...IF[rk[X],T,F]...] for ONE.OF
IF[r1[X],F,...IF[rk[X],F,T]...] for NONE.OF
The typeset is thus {X | tr=T}
A typeset for the context C and new "r is one involving only "r and
type recognizers introduced in C
Assume new and distinct function symbols (relative to C)
(a) "const is n-ary function symbol
"btm is 0-ary function symbol (optional)
"r, "ac{1}..."ac{n}, 1-ary function symbols
"wfn 2-ary function symbol
(b.i) tr{i} typeset restriction for C and "r (on variable Xi) 1≤i≤n
(c.i : 1≤i≤n)
dv{i} is bottom from previous shell (or "btm[] if ∃btm)
C |- Xi=dv{i} ∧ "r["btm[]]≠F -> tr{i}≠F
("r["btm[]]≠F omitted if ¬∃btm)
then we may introduce shell "const of n arguments
with
bottom element "btm (if ∃btm)
recognizer "r
accessors "ac{1}..."ac{n}
type restrictions tr{1}...tr{n}
default values dv{1}...dv{n}
well-founded relation "wfn
and axioms (taking X=X for X≠"btm[] if ¬∃btm)
(1) "r[X]=T ∨ "r[X]=F
"r["const[X1...Xn]]≠F
"r["btm[]]≠F (∃btm)
"const[X1...Xn]≠"btm[] (∃btm)
"r[X]≠F ∧ X≠"btm[] -> "const["ac{1} X,..."ac{n} X]=X
For 1≤i≤n
(2i) tr{i}≠F -> "ac{i}["const[X1 ... Xn]]=Xi
For 1≤i≤n
(3i) "r[X]=F ∨ X="btm[] ∨ (tr{i}=F ∧ X="const[X1...Xn]) -> "aci[X]=dv{i}
(4) "r[T]=F
"r[F]=F
For each previous shell recognizer r'
(5r') "r[X]≠F -> r'[X]=F
Let w be well-founded relation of previous shell or FALSE2 if no previous shell
where (FALSE2 X Y)=F
(6) "wfn[X Y]≠F <->
w[X Y]≠F ∨
("r[Y]≠F ∧ Y≠"btm[] ∧
(X="ac{1}[Y]∨...∨ X="ac{n}[Y]) )
(7) declare WFO["wfn]
or
(6*) "wfn[X Y]≠F <->
w[X Y]≠F ∨
("r[Y]≠F ∧ Y≠"btm[] ∧
(X="ac{1}[Y]∨...∨ X="ac{n}[Y] ∨
"wfn[X,"ac{1}[Y]]∨...∨"wfn[X,"ac{n}[Y]]) )
(7*) declare WFO["wfn]
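Informally, then, a shell object is just a tagged tuple.  A minimal sketch in
Python of one hypothetical shell (a PUSH/EMPTY `stack' with a number component
and a stack component; the example and all names are illustrative, not part of
the formal principle):

  # Sketch: a hypothetical PUSH/EMPTY "stack" shell as tagged tuples.
  # Component 1 is restricted to (non-negative) numbers with default 0,
  # component 2 to stacks with default EMPTY, mirroring axioms (1)-(3i).
  EMPTY = ('STACK',)                                   # the bottom object

  def STACKP(x):                                       # the recognizer
      return isinstance(x, tuple) and x[:1] == ('STACK',) and len(x) in (1, 3)

  def PUSH(a, s):                                      # the constructor
      a = a if isinstance(a, int) and a >= 0 else 0    # tr{1} violated => dv{1}
      s = s if STACKP(s) else EMPTY                    # tr{2} violated => dv{2}
      return ('STACK', a, s)

  def TOP(x):                                          # accessor 1
      return x[1] if STACKP(x) and x != EMPTY else 0   # dv{1} outside the shell
  def POP(x):                                          # accessor 2
      return x[2] if STACKP(x) and x != EMPTY else EMPTY

  s1 = PUSH(3, EMPTY)
  assert STACKP(s1) and s1 != EMPTY                    # axioms (1)
  assert TOP(s1) == 3 and POP(s1) == EMPTY             # axiom (2i)
  assert TOP(EMPTY) == 0                               # axiom (3i)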
More on Typesets, Type restrictions, etc.
Types are the classifications introduced by shell principle.
A type recognizer is either TRUEP,FALSEP,or a shell recognizer.
(e.g. declare the singletons T,F as shells with recognizers TRUEP,FALSEP.)
Types are disjoint r[X]=T -> r'[X]=F for r,r' distinct type recognizers.
Let S{r} denote the elements of type r, -S{r} the elements not of type r.
A typeset has one of the forms
ONE.OF<r1...rk> - the union of a finite number of types - S{r1}∪...∪S{rk}
NONE.OF<r1...rk> - the complement of ONE.OF - -S{r1}∩...∩-S{rk}
Typesets can be expressed syntactically via terms called typeset
restrictions. Let X be a variable, r1..rk type recognizers.
A type restriction wrt {X,r1,...rn} is a formula constructed from
T,F,ri[X] using IF
Lemma if tr is a typerestriction wrt {X,r1,...rn} then
{X | tr=T } is a typeset
Note (1) by properties of recognizers tr=T ∨ tr=F
(2) by properties of IF and rec tr is equivalent to
disjunction of terms (typers) of form c1=T∧...∧ci=T∧d1=F∧...∧dj=F
where each c and d is rm[X] 1≤m≤n
by disjointness we may assume i≤1 and c not among the d's
and using c=T∧d1=F∧...∧dj=F iff c=T the disjuncts have one of the forms
c=T or d1=F∧...∧dj=F
wma no positive r occurs in negative conjunct
r ∨ ¬r∧p ≡ r ∨ p
if there are any neg conjuncts then positive ones may be eliminated
c=T ∨ d=F holds iff d=F holds by reductions and disjointness
(while possibly c=F∧d=F)
thus wma that disjuncts are all of the form c=T (case 1 of typesets)
or all of the form d1=F∧...dk=F
in the latter case -- we reduce to 1 disjunct as follows --
by disjointness
d1=F∧..∧dj=F∧e1=F∧..∧ek=F ∨ d1=F∧..∧dj=F∧f1=F∧..∧fk=F ≡ d1=F∧..∧dj=F
(where e's are distinct from d's and f's)
successive application of this reduces to 1 disjunct
(and thus to case 2 of typesets)
(there are degenerate cases: (TRUE) some conjunct reduces to mt;
(FALSE) the entire disjunct reduces to mt)
Thus any tr over {X,r1,...rn} can be reduced to an equivalent (in BM)
formula of one of the forms
(TRUE)
(FALSE)
r{1}[X]=T ∨...∨ r{k}[X]=T
r{1}[X]=F ∧...∧ r{k}[X]=F
(the r{i} distinct members of {r1,...,rn})
Note that in shell definition the requirement that |- tri[dvi]
prevents empty shells.
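Since every typeset reduces to one of these two forms, typesets can be
manipulated as finite sets of recognizers with a polarity.  A small Python
sketch of that set algebra (names illustrative):

  # Sketch: typesets as a polarity plus a finite set of type recognizers.
  # ('ONE.OF', S) is the union of the types in S, ('NONE.OF', S) the complement
  # of that union; intersection and union follow the set algebra directly.
  def one_of(*types):  return ('ONE.OF', frozenset(types))
  def none_of(*types): return ('NONE.OF', frozenset(types))

  def ts_intersect(a, b):
      (ka, sa), (kb, sb) = a, b
      if ka == kb == 'ONE.OF':   return ('ONE.OF', sa & sb)
      if ka == kb == 'NONE.OF':  return ('NONE.OF', sa | sb)
      pos, neg = (sa, sb) if ka == 'ONE.OF' else (sb, sa)
      return ('ONE.OF', pos - neg)          # keep the named types not excluded

  def ts_union(a, b):
      (ka, sa), (kb, sb) = a, b
      if ka == kb == 'ONE.OF':   return ('ONE.OF', sa | sb)
      if ka == kb == 'NONE.OF':  return ('NONE.OF', sa & sb)
      pos, neg = (sa, sb) if ka == 'ONE.OF' else (sb, sa)
      return ('NONE.OF', neg - pos)         # complement shrinks by the named types

  assert ts_intersect(one_of('TRUEP', 'FALSEP'), none_of('FALSEP')) == one_of('TRUEP')
  assert ts_union(one_of('NUMBERP'), none_of('NUMBERP', 'LISTP')) == none_of('LISTP')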
BMA -- Measured lexicographic ordering
Let r be binary relation symbol. The k-ary lexicographic relation <{r,k}
induced by r is defined by (meta)induction on k as follows
Y <{r,1} X <-> r[Y,X]≠F
Y0,..,Yk <{r,k+1} X0,...,Xk <->
Y0 <{r,1} X0 ∨
Y0=X0 ∧ (Y1,..,Yk <{r,k} X1,...,Xk)
Let m_=m1,...mk be list of n-ary function symbols
Write m_[t1,...tn] for the list m1[t1,...,tn],...,mk[t1,...,tn]
The n-ary measured lexicographic relation <{r,m_}induced by r,m_
is defined by
Y1,...,Yn <{r,m_} X1,...,Xn <-> m_[Y1,...Yn] <{r,k} m_[X1,...,Xn]
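A Python sketch of these two relations (r, the measures and the example
instance are illustrative stand-ins):

  # Sketch: the k-ary lexicographic extension of a relation r, and the measured
  # ordering it induces via a list of measure functions m_.
  def lex_less(r, ys, xs):
      """Y1..Yk <{r,k} X1..Xk"""
      for y, x in zip(ys, xs):
          if r(y, x):
              return True
          if y != x:
              return False
      return False

  def measured_lex_less(r, ms, ys, xs):
      """Y1..Yn <{r,m_} X1..Xn  <->  m_[Y1..Yn] <{r,k} m_[X1..Xn]"""
      return lex_less(r, [m(*ys) for m in ms], [m(*xs) for m in ms])

  # illustrative instance: LESSP on naturals, measures = the two arguments
  lessp = lambda a, b: a < b
  ms = [lambda x, y: x, lambda x, y: y]
  assert measured_lex_less(lessp, ms, (1, 5), (2, 0))    # first measure drops
  assert measured_lex_less(lessp, ms, (2, 3), (2, 7))    # tie, second drops
  assert not measured_lex_less(lessp, ms, (2, 3), (2, 3))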
BMA -- the recursion principle
In order to state the principle for introduction of recursively
defined functions we need an additional syntactic notion
- the f-machine for t (where f is a function symbol and t is a term).
Informally this is a sequence of all instances of subterms of t
of the form f[s_] together with a formula describing the
(f-independent) conditions implying that the value of the subterm
is needed to determine the value of t. (Recall that for a term t
of the form IF[t0,t1,t2] the value of t1 determines the value
of t (independent of the value of t2) in the case t0≠F and
dually in the case t0=F)
Formally we define mach{f,t} by induction on construction of terms
as follows
t is variable or 0-ary =>
mach{f,t} := <> (the empty list)
t is f[t1...tn] =>
mach{f,t} := <(f[t1...tn],T=T)> * mach{f,t1} *...* mach{f,tn}
t is IF[t0,t1,t2] and f does not occur in t0 =>
mach{f,IF[t0,t1,t2]} := and(t0≠F,mach{f,t1}) * and(t0=F,mach{f,t2})
t is IF[t0,t1,t2] and f occurs in t0 =>
mach{f,IF[t0,t1,t2]} := mach{f,t0} * mach{f,t1} * mach{f,t2}
t is g[t1...tn] with g not f or IF =>
mach{f,g[t1...tn]} := mach{f,t1} *...* mach{f,tn}
where and(q,<...(si,qi)...>) = <...(si,qi∧q)...>
Alternately mach{f,t} <::> mach1{f,t,T=T} where
t is f[t1,..,tn] =>
mach1{f,t,Q} := <(t,Q)> * mach1{f,t1,Q} *...* mach1{f,tn,Q}
t is g[t1,..,tk] (g not f or IF) =>
mach1{f,t,Q} := mach1{f,t1,Q} *...* mach1{f,tk,Q}
t is IF[t0,t1,t2] and f does not occur in t0 =>
mach1{f,t,Q} := mach1{f,t1,Q∧t0≠F} * mach1{f,t2,Q∧t0=F}
t is IF[t0,t1,t2] and f occurs in t0 =>
mach1{f,t,Q} := mach1{f,t0,Q} * mach1{f,t1,Q} * mach1{f,t2,Q}
For (f[s_],@Q) in mach{f,t} we call @Q the f-free test governing the
corresponding occurrence of f in t.
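A Python sketch of mach{f,t} over a toy term representation (terms as nested
tuples, variables as strings; the helper names are illustrative, and PLUS below
is just the NBM definition used as a test case):

  # Sketch: mach{f,t} over a toy term representation.  Terms are Python tuples
  # ('FN', arg1, ..., argn); variables are strings.  Each result pair is
  # (recursive call, governing tests), a test being (test_term, polarity)
  # with polarity True for t0≠F and False for t0=F.
  def occurs(f, t):
      return isinstance(t, tuple) and (t[0] == f or any(occurs(f, s) for s in t[1:]))

  def mach(f, t, tests=()):
      if not isinstance(t, tuple):                  # variable: no calls
          return []
      if t[0] == 'IF' and not occurs(f, t[1]):      # f-free test splits branches
          test, then, els = t[1], t[2], t[3]
          return mach(f, then, tests + ((test, True),)) + \
                 mach(f, els,  tests + ((test, False),))
      calls = [(t, tests)] if t[0] == f else []
      return calls + [c for s in t[1:] for c in mach(f, s, tests)]

  # PLUS[X,Y] <= IF[ZEROP[X], FIX[Y], ADD1[PLUS[SUB1[X],Y]]]
  body = ('IF', ('ZEROP', 'X'), ('FIX', 'Y'),
                ('ADD1', ('PLUS', ('SUB1', 'X'), 'Y')))
  assert mach('PLUS', body) == \
      [(('PLUS', ('SUB1', 'X'), 'Y'), ((('ZEROP', 'X'), False),))]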
Let C be a context. We may introduce a function defined by recursion
if
(a) f is `new' n-ary function symbol
(b) x1...xn are distinct variables
(c) t a term constructed from old function symbols, f, and x1...xn
(d) for 1≤i≤l; f[t{i,1}...t{i,n}] are the occurrences of f in t
(e) for 1≤i≤l @Qi is the f-free test governing f[t{i,1}...t{i,n}] in t
(f) for 1≤i≤l si the substitution {x1 <- t{i,1}...xn <- t{i,n}}
(g) m1...mk n-ary function symbols
(h) r a WFO
If C |- @Qi -> [x1,...,xn]/si <{r,m_} [x1,...,xn] for 1≤i≤l
then we may introduce the function f with the defining axiom
f[x1 ... xn] = t
To emphasise the definitional aspect we may write
f[x1 ... xn] <= t
BMA -- the induction principle
Rule for Induction
~~~~~~~~~~~~~~~~~~
If
(a) @P be a formula (the proposition to be proved)
(b) x1,...xn be distinct variables
(c) m1...mk n-ary function symbols (m_ interpreted as above)
(d) r a WFO
(e) @Qi 1≤i≤l formulae (describing the l non-base cases)
(f) for 1≤i≤l; 1≤j≤hi s{i,j} is a substitution
(the jth instance of IH in ith case; hi the number of such instances)
(g) @Q0 <::> ¬@Q1 ∧ ... ∧ ¬@Ql (the base case)
then the induction rule is
from
@Qi -> [x1,...,xn]/s{i,j} <{r,m_} [x1,...,xn] for 1≤i≤l; 1≤j≤hi
@Q0 -> @P
@Qi ∧ @P/s{i,1} ∧ ... ∧ @P/s{i,hi} -> @P for 1≤i≤l
infer
@P
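Example (illustrative, using ZEROP, FIX, SUB1 and LESSP from the NBM sections
below): to prove a formula @P in the single variable X by the scheme suggested
by PLUS, take n=1, l=1, h1=1, @Q1 <::> ZEROP[X]=F, s{1,1} = {X <- SUB1[X]},
m_ = <FIX>, r = LESSP.  The proviso @Q1 -> [X]/s{1,1} <{r,m_} [X] unfolds to
ZEROP[X]=F -> LESSP[FIX[SUB1 X],FIX[X]]≠F
and the rule specializes to ordinary induction on the numbers:
from ZEROP[X]≠F -> @P and ZEROP[X]=F ∧ @P/s{1,1} -> @P infer @P.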
PBM
Let PBM0 be the context with language {T,F,TRUEP,FALSEP,IF,EQUAL}
the propositional axioms above, and declarations for TRUEP and FALSEP as recognizers
PBM is the extension of PBM0 by the definitions
NOT[X] <= IF[X,F,T]
AND[X,Y] <= IF[X,IF[Y,T,F],F]
OR[X,Y] <= IF[X,T,IF[Y,T,F]]
IMPLIES[X,Y] <= IF[X,IF[Y,T,F],T]
IFF[X,Y] <= IF[X,IF[Y,T,F],IF[Y,F,T]]
NBM -- the NUMBERP shell
NBM0 <::> PBM + NUMBER shell where
NUMBER shell is the shell
ADD1 of 1 argument
with
bottom element ZERO
recognizer NUMBERP
accessors SUB1
type restrictions tr{1}:= NUMBERP[X1]
default values dv{1}:= ZERO[]
well-founded relation SUB1P (LESSP)
We abbreviate ZERO[] to 0.
NBM1 <::> NBM0 + {ZEROP,FIX,PLUS,TIMES} where
ZEROP[X] <= OR[EQUAL[X,0], NOT[NUMBERP X]]
FIX[X] <= IF[NUMBERP X,X,0]
PLUS[X,Y] <= IF[ZEROP[X],FIX[Y],ADD1[PLUS[SUB1 X,Y]]]
TIMES[X,Y] <= IF[ZEROP[X],0,PLUS[TIMES[SUB1 X,Y],Y]]
NBM is the closure of NBM0 under definition by the recursion principle
The axioms for NUMBERP shell are:
(1) NUMBERP[X]=T ∨ NUMBERP[X]=F
NUMBERP[ADD1[X1]]≠F
NUMBERP[0]≠F
ADD1[X1]≠0
NUMBERP[X]≠F ∧ X≠0 -> ADD1[SUB1[X]]=X
For 1≤i≤n
(2i) NUMBERP[X1]≠F -> SUB1[ADD1[X1]]=X1
For 1≤i≤n
(3i) NUMBERP[X]=F ∨ X=0 ∨ (NUMBERP[X1]=F ∧ X=ADD1[X1]) -> SUB1[X]=0
(4) NUMBERP[T]=F
NUMBERP[F]=F
For each previous shell recognizer r'
(5r') NUMBERP[X]≠F -> r'[X]=F
Let w be well-founded relation of previous shell or FALSE2 if no previous shell
where (FALSE2 X Y)=F
(6) SUB1P[X Y]≠F <-> (NUMBERP[Y]≠F ∧ Y≠0 ∧ X=SUB1[Y])
(7) declare WFO[SUB1P]
or
(6*) LESSP[X Y]≠F <-> (NUMBERP[Y]≠F ∧ Y≠0 ∧ (X=SUB1[Y] ∨ LESSP[X,SUB1[Y]]≠F))
(7*) declare WFO[LESSP]
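Read over ordinary non-negative Python integers, with anything else coerced as
the shell axioms dictate, the NBM1 definitions above compute the expected
arithmetic; a sketch (an illustration of the intended interpretation, not of
the formal theory):

  def NUMBERP(x): return isinstance(x, int) and x >= 0
  def ZEROP(x):   return x == 0 or not NUMBERP(x)
  def FIX(x):     return x if NUMBERP(x) else 0
  def SUB1(x):    return x - 1 if NUMBERP(x) and x > 0 else 0
  def ADD1(x):    return FIX(x) + 1         # coerces to the default, then adds 1

  def PLUS(x, y):
      return FIX(y) if ZEROP(x) else ADD1(PLUS(SUB1(x), y))

  def TIMES(x, y):
      return 0 if ZEROP(x) else PLUS(TIMES(SUB1(x), y), y)

  assert PLUS(3, 4) == 7 and TIMES(3, 4) == 12
  assert PLUS('not-a-number', 5) == 5       # ZEROP treats non-numbers like 0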
GROUND.ZERO -- NBM + LISTP + LITATOM + ...
BOOT.STRAP.INSTRS
(ADD.SHELL0 const btm r (...(aci tri dvi)...) )
(
(ADD.SHELL0 FALSE NIL FALSEP NIL)
(ADD.SHELL0 TRUE NIL TRUEP NIL)
(DEFN0 NOT (P) (IF P (FALSE) (TRUE)) T)
(DEFN0 AND (P Q) (IF P (IF Q (TRUE) (FALSE)) (FALSE)) T)
(DEFN0 OR (P Q) (IF P (TRUE) (IF Q (TRUE) (FALSE))) T)
(DEFN0 IMPLIES (P Q) (IF P (IF Q (TRUE) (FALSE)) (TRUE)) T)
(ADD.SHELL0 ADD1 ZERO NUMBERP ((SUB1 (ONE.OF NUMBERP) ZERO)))
(DEFN0 LESSP (X Y)
(IF (OR (EQUAL Y 0) (NOT (NUMBERP Y)))
(FALSE)
(IF (OR (EQUAL X 0) (NOT (NUMBERP X)))
(TRUE)
(LESSP (SUB1 X) (SUB1 Y))))
T)
(PUT1 LESSP LEVEL.NO 0)
(DEFN0 ZEROP (X) (IF (EQUAL X 0) T (IF (NUMBERP X) F T)) T)
(DEFN0 FIX (X) (IF (NUMBERP X) X 0) T)
(DEFN0 PLUS (X Y) (IF (ZEROP X) (FIX Y) (ADD1 (PLUS (SUB1 X) Y))) T)
(ADD.AXIOM1 COUNT.NUMBERP (REWRITE) (IMPLIES (NUMBERP I) (EQUAL (COUNT I) I)) NIL)
(ADD.SHELL0 PACK NIL LITATOM ((UNPACK (NONE.OF) ZERO)))
(ADD.SHELL0 CONS NIL LISTP ((CAR (NONE.OF) ZERO) (CDR (NONE.OF) ZERO)))
(DEFN0 NLISTP (X) (NOT (LISTP X)) T)
(ADD.SHELL0 MINUS NIL NEGATIVEP ((NEGATIVE.GUTS (ONE.OF NUMBERP) ZERO)))
(DEFN0 DIFFERENCE (I J)
(IF (ZEROP I) 0 (IF (ZEROP J) I (DIFFERENCE (SUB1 I) (SUB1 J)))) T)
(DEFN0 TIMES (I J) (IF (ZEROP I) 0 (PLUS J (TIMES (SUB1 I) J))) T)
(ADD.AXIOM1 RECURSION.BY.DIFFERENCE (INDUCTION)
(IMPLIES (AND (NUMBERP I) (NUMBERP N) (NOT (EQUAL I 0)) (NOT (EQUAL N 0)))
(LESSP (DIFFERENCE I N) I)))
(DEFN0 QUOTIENT (I J)
(IF (ZEROP J) 0 (IF (LESSP I J) 0 (ADD1 (QUOTIENT (DIFFERENCE I J) J)))) T)
(DEFN0 REMAINDER (I J)
(IF (ZEROP J) (FIX I) (IF (LESSP I J) (FIX I) (REMAINDER (DIFFERENCE I J) J)))
T)
(DEFN0 LEGAL.CHAR.CODES NIL
(QUOTE (33 35 36 38 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61
62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85
86 87 88 89 90 92 94 95 97 98 99 100 101 102 103 104 105 106 107 108 109
110 111 112 113 114 115 116 117 118 119 120 121 122 126)) NIL)
(DEFN0 ILLEGAL.FIRST.CHAR.CODES NIL
(QUOTE (43 45 46 48 49 50 51 52 53 54 55 56 57)) NIL)
(DEFN0 LENGTH (LST) (IF (LISTP LST) (ADD1 (LENGTH (CDR LST))) 0) NIL)
(DEFN0 MEMBER (X LST)
(IF (NLISTP LST) F (IF (EQUAL X (CAR LST)) T (MEMBER X (CDR LST)))) NIL)
(DEFN0 SUBSETP (X Y)
(IF (NLISTP X) T (IF (MEMBER (CAR X) Y) (SUBSETP (CDR X) Y) F)) NIL)
(DEFN0 LAST (L) (IF (LISTP L) (IF (LISTP (CDR L)) (LAST (CDR L)) L) L) NIL)
(DEFN0 LEGAL.CHAR.CODE.SEQ (LST)
(AND (LISTP LST)
(SUBSETP LST (LEGAL.CHAR.CODES))
(NOT (MEMBER (CAR LST) (ILLEGAL.FIRST.CHAR.CODES)))
(EQUAL (CDR (LAST LST)) 0))
NIL)
(DEFN0 SYMBOLP (X) (AND (LITATOM X) (LEGAL.CHAR.CODE.SEQ (UNPACK X))) NIL)
(DEFN0 LOOKUP (X ALIST)
(IF (NLISTP ALIST)
NIL
(IF (AND (LISTP (CAR ALIST)) (EQUAL X (CAAR ALIST)))
(CDAR ALIST)
(LOOKUP X (CDR ALIST))))
NIL)
(DCL0 ARITY (X))
(DCL0 FORMP (X))
(DEFN0 FORM.LSTP (X)
(IF (NLISTP X) (EQUAL X NIL) (AND (FORMP (CAR X)) (FORM.LSTP (CDR X)))) NIL)
(DCL0 APPLY (X LST))
(DCL0 MEANING (X ALIST))
(DEFN0 MEANING.LST (X ALIST)
(IF (NLISTP X) NIL (CONS (MEANING (CAR X) ALIST) (MEANING.LST (CDR X) ALIST)))
NIL)
(SETUP.META.NAMES)
)
Interpreting BMA in NBM
Let p,p0,p1 be standard pairing, unpairing functions on numbers
(m,n) will denote p[m,n]+1
Sequences
<> :: 0
<m> :: (m,0)
<m0,...,mk> :: (m0,<m1,...,mk>)
P{i} ith tuple projection
x#i = P{i}[x]
<m0...mk>#i = mi i≤k
= 0 OW
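A Python sketch of one standard choice of pairing (Cantor pairing is an
assumption; any primitive recursive pairing with unpairing functions p0, p1
would do) together with the sequence coding and projection just described:

  def p(m, n):                  # pairing
      return (m + n) * (m + n + 1) // 2 + m

  def p0(x):                    # unpairing: p0(p(m,n)) = m
      w = 0
      while (w + 1) * (w + 2) // 2 <= x:
          w += 1
      return x - w * (w + 1) // 2

  def p1(x):                    # unpairing: p1(p(m,n)) = n
      w = 0
      while (w + 1) * (w + 2) // 2 <= x:
          w += 1
      return w - p0(x)

  def pair(m, n):               # (m,n) denotes p[m,n]+1, so 0 is free to code <>
      return p(m, n) + 1

  def seq(*ms):                 # <m0,...,mk>
      out = 0
      for m in reversed(ms):
          out = pair(m, out)
      return out

  def proj(x, i):               # x#i, with <m0...mk>#i = mi for i≤k, 0 otherwise
      while x != 0 and i > 0:
          x, i = p1(x - 1), i - 1
      return p0(x - 1) if x != 0 else 0

  assert [proj(seq(3, 1, 4), i) for i in range(5)] == [3, 1, 4, 0, 0]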
Fix an enumeration of shell constructor and bottom symbols
(beginning with pseudo shells TRUE and FALSE)
for each shell "c of n arguments (with bottom element "b (if ∃btm))
let c.TAG stand for <&c n> (b.TAG for <&b> if ∃btm)
where number "c is &c (number "b is &b)
Shell elements will be tagged tuples
We define the interpretation # on function symbols of BMA
It extends to terms of BMA as follows
x# is x
f[t1,...tn]# is f#[t1#,...,tn#]
First the pseudo shells
TRUE of 0 arguments and recognizer TRUEP
FALSE of 0 arguments and recognizer FALSEP
T# <::> <<TRUE.TAG 0>>
F# <::> <<FALSE.TAG 0>>
IF#[X,Y,Z] <= IF[NOT[EQUAL[X,F#]],Y,Z]
EQ#[X,Y] <= IF[EQUAL[X,Y],T#,F#]
TRUEP#[X] <= EQ#[X,T#]
FALSEP#[X] <= EQ#[X,F#]
consider shell "c of n arguments (with bottom element "b (if ∃btm))
recognizer "r
accessors "ac{1}..."ac{n}
type restrictions tr{1}...tr{n}
default values dv{1}...dv{n}
well-founded relation "wfn
for simplicity we introduce auxiliary functions
c.TRi[xi] <= tri# (recall tri is term built from r[xi], T,F,IF
in particular tri = T <-> tri ≠ F )
c.DVi[x] <= IF#[c.TRi[x],x,dv{i}#]
b#[] <= <b.TAG>
r#[x] <= IF[EQUAL[x,b#[]],T#,
IF[AND[EQUAL[len[x],n+1],EQUAL[P{0}[x],c.TAG]],
          AND#[c.TR1[P{1}[x]],...,c.TRn[P{n}[x]] ],
          F#]]
c#[x1,...,xn] <= <c.TAG,c.DV1[x1],...,c.DVn[xn]>
ac{i}#[x] <= IF#[r#[x],P{i}[x],dv{i}#]
Translate wfo's so that C:BMA |- t0 < t1 then C#:NBM |- t0# <# t1#
Introduce
c.cnt[x] <= IF#[r#[x],
IF#[EQ#[x,b#[]],
0,
1+c.cnt[ac{1}#[x]]+...+c.cnt[ac{n}#[x]] ]
ocnt[x] ]
where ocnt is the count function for the previous shell
ADD1.cnt[x] <= IF#[NUMBERP#[x],
IF#[EQ#[x,0#],0,1+ADD1.cnt[SUB1#[x]] ]
0 ]
(making TRUE[],FALSE[] have count 0?)
Then for a function introduced by the recursion principle
f#[x_] <= t# will be definable
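The c.cnt functions just count constructor applications.  A sketch of the same
idea over the tagged-tuple picture of shells rather than over the numeric codes
(numbers are raw ints here, so their ADD1-count is not reflected; cf. the
COUNT.NUMBERP axiom in GROUND.ZERO below):

  def count(x):
      if isinstance(x, tuple) and len(x) > 1:          # constructed shell object
          return 1 + sum(count(c) for c in x[1:])
      return 0                                         # bottoms, non-shell objects

  assert count(('STACK', 3, ('STACK', 5, ('STACK',)))) == 2
  assert count(('STACK',)) == 0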
Reductions -- NBM recursion + SUB1P induction ==> NBM induction
Let C be an NBM context with induction restricted to SUB1P induction
Assume the conditions for some instance of BMind -- e.g.
(a) @P be a formula (the proposition to be proved)
(b) x1,...xn be distinct variables
(c) m1...mk n-ary function symbols (m_ interpreted as above)
(d) r a WFO
(e) @Qi 1≤i≤l formulae (describing the l non-base cases)
(f) for 1≤i≤l; 1≤j≤hi s{i,j} is a substitution
(g) @Q0 <::> ¬@Q1 ∧ ... ∧ ¬@Ql (the base case)
and
(Hyp1) C |- @Qi -> [x1,...,xn]/s{i,j} <{r,m_} [x1,...,xn] for 1≤i≤l; 1≤j≤hi
(Hyp2) C |- @Q0 -> @P
(Hyp3) C |- @Qi ∧ @P/s{i,1} ∧ ... ∧ @P/s{i,hi} -> @P for 1≤i≤l
show there is NBM functional extension C' of C
C' |- @P
Proof: (ala Kreisel/Tait) Assume n=1 (or x,y,z stand for n-tuples)
Notation for sequences p[x,y] the std pairing function (x,y)=p[x,y]+1
<> the empty sequence 0
<x1 ... xn> P[x1,...P[xn,0]..]
(x,u) add x to front of u
u*v concatenate u,v
mx{y:<>}f[y] = 0
mx{y:(x,u)}f[y] = max[f[x],mx{y:u}f[y]]
Introduce J,J&,L,L&,I,p&
J[x] = <t{1,1}...t{1,h1}>  if Q1[x]
       ....
       <t{k,1}...t{k,hk}>  if Qk[x]
       <>                  OW (e.g. if Q0[x])
J&[<>] = <>
J&[x⊗s] = J[x]*J&[s]
L[x] = mx{y:J[x]}[1+L[y]] (BMrec by Hyp1)
L&[s] = mx{y:s}[1+L[y]]
I[s,0] = s
I[s,k+1] = J&[I[s,k]]
p&[s] = and{y:s}p[y]
Lemmas (provable in NBM|)
(0) I[s,k+1] ≠ <> -> I[s,k]≠<>
since I[s,k]=<> -> J&[I[s,k]]=I[s,k+1]=<>
(1) L&[s] = 0 -> s=<>
(2) L[x]=L&[J[x]] (using definitions of L and L&)
(3) u≠<> -> L&[u]=1+L&[J&[u]]
L&[u]=mx{y:u}[1+L[y]] dfn L&
=1+mx{y:u}[L[y]] (u≠<>)
=1+mx{y:u}[mx{z:J[y]}[1+L[z]]] dfn L
=1+mx{z:J&[u]}[1+L[z]] dfn J&, property mx
=1+L&[J&[u]] dfn L&
(4) I[s,y]≠<> -> L&[I[s,y]] + y = L&[s]
Induction phi(y) <::> I[s,y]≠<> -> L&[I[s,y]] + y = L&[s]
phi[0] is s≠<> -> L&[s]=L&[s]
assume phi[y] and I[s,y+1]≠<>
(4.1) I[s,y]≠<> (0)
(4.2) L&[I[s,y]]+y=L&[s] (phi[y],4.1)
(4.3) L&[I[s,y+1]]+y+1 = L&[J&[I[s,y]]]+1+y (defn I)
= L&[I[s,y]] + y (3,4.1)
= L&[s] (4.2)
(5) I[s,L&[s]]=<>
I[s,L&[s]]≠<> -> L&[I[s,L&[s]]] =0 ((4) with y=L&[s])
L&[I[s,L&[s]]] = 0 -> I[s,L&[s]]=<>
(6) p&[J&[s]] -> p&[s]
definitions p&,J& and Hyp2,3
(7) p[x]
Show phi[n] <::> n≤L&[s] -> p&[I[s,L&[s]-n]] by induction on n
phi[0] is p&[<>] trivially true
assume phi[n] and n+1≤L&[s]
thus p&[I[s,L&[s]-(n+1)+1]]
p&[J&[I[s,L&[s]-(n+1)]]] (defn I)
p&[I[s,L&[s]-(n+1)]] (6)
in particular p&[I[<x>,0]] ≡ p[x]
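A Python sketch of the construction (J, p and the example instance -- ordinary
induction on the naturals -- are illustrative placeholders; L& is computed via
lemma (3) rather than by BM recursion):

  def J_amp(s, J):                          # J&
      return [y for x in s for y in J(x)]

  def L_amp(s, J):                          # L&: length of the longest J-descent
      return 0 if not s else 1 + L_amp(J_amp(s, J), J)

  def I(s, k, J):                           # I[s,k]: k-fold iterate of J&
      for _ in range(k):
          s = J_amp(s, J)
      return s

  J = lambda x: [x - 1] if x > 0 else []    # ordinary induction on naturals
  p = lambda x: x + 0 == x                  # a toy property provable by induction
  s = [7]
  assert I(s, L_amp(s, J), J) == []                       # lemma (5)
  for n in range(L_amp(s, J) + 1):                        # lemma (7)
      assert all(p(y) for y in I(s, L_amp(s, J) - n, J))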
Reduction to standard form of recursion
Results from literature
Howard80
(1) if phi is defined by k applications of e-recursions on level1 (and some EDs)
then ord[phi] < exp[e,k+1] (for e=w get exp[w,k+1])
(2) if phi is defined by k applications of nested e-recursions on level1
(and some EDs) then ord[phi] < exp[2,exp[e,k+1]] (e=w w\[w\k])
(3) if phi is defined by k applications of e-recursions on level2
(and some EDs) then ord[phi] < exp[2,exp[e,k+1]] (e=w w\[w\k])
p101
(4) every functional with measure less than w\w (w\[w\w]) can be defined
by ordinary PRs on level 1 (levels 1,2)
Defining schema
x,y,z vars u,v numerical vars (level ≤ 2)
X,Y,Z seq of vars
f,g,h.. function constants introduced by ED,GPR,OR
A,B terms
(ED) hX = A[X] A[X]:0 vars among X=x1...xk;xi:si of level≤1,
consts among those previously introduced
(GPR) h0X = fX f:s (level≤2) s=(s1...sk->0) X=x1...xk xi:si
h[Sv]X = A[v,hv,X] A[v,z,X]:0 (as above) and with z:s v:0
(OR{e}) huvX = fX if ¬(0<v<u<e)
huvX = A[v,Hv,X] if (0<v<u<e)
u:0, z:0->s=(0,s1...sk->0), f,X,v,A as above
OR is on level 1 if level[z]=1 and in A z occurs in no components of level 1
(PR) h0XY = fXY
h[Sv]XY = gv[hvX]XY
level g≤2, level hvX≤1
if level hvX=0 (Y mt) have PR on level 1
if level hvX=1 (Y non-mt yεY y:0) have PR on level 2
Analysis of BM-recursive via interpreter
symbols:
variables
function symbols for each arity≥0
basic functions:
TRUE{0},FALSE{0},IF{3},EQUAL{2},
ZERO{0},NUMBERP{1},ADD1{1},SUB1{1},LESSP{2}
defined functions:
F{m,n} mth function of arity n
terms:
variables
f[t1...tk]
f[t1...tk;t_]|{m1...mk;d}
basic terms:
constructed from basic function symbols and variables
ground terms:
no variables
derivation: D = <D{1,k1}...D{m,km}> where
D{i,ki} has form F{i}[X1,...,Xni] <- body{i}
with
(1) F{i} = F{mi,ni}; ki≤ni
(2) body{i} constructed from {X1...Xni}, basic symbols and F{j} j≤i
(3) occurrences of F{i} in body{i} are of the form
F{i}[t1...tk;t_]|{X1...Xk;d}
with d a basic term
The term F{i}[t1...tk;t_]|{X1...Xk;d} is a Course of Values restriction
wrt the standard ordering of type w\k e.g.
F{i}[t1...tk;t_]|{X1...Xk;d} <- F{i}[t1...tk;t_] if <t1...tk> <[k] <X1...Xk>
                             <- d                OW
Thus F{i} is defined by nested <[k]-recursion uniformly in F{j} j<i and basics
We define a notion of state and reduction (wrt D and interpretation of
basic symbols) (ala TAIT) that defines the value of a term
Let t be a ground term and let f1....fs be a listing of the occurrences in t
of terms of the form F{i}[t_] or F{i}[t_]|{m1...mk;d}.
(s may be 0 -- e.g. if t is basic)
The state for t then is {t;f1...fs}. Note that if values for the fj are obtained
then t can be evaluated (using interpretation of basic symbols)
A state is either a value, or the state for some term t.
If S={t;f1...fs} is a state then the successor S+ is determined as follows
S+ is value[t] if s=0
S+ is {t';f1'...fs-1'fs1...fsq} OW
where
fs is F{i}[t1...tk;t_] or F{i}[t1...tk;t_]|{m1...mk;d}
for some i, where all the ti and d are basic ground terms
Let v1...vk;v_ (v1...vk;v_ ;w) be the values of t1...tk;t_ (t1...tk;t_;d)
Let ts = body{i}{X1←v1...Xk←vk;X_←v_} and {ts;fs1...fsq} the state for ts
if fs is non-restricted or <v1...vk> <[k] <m1...mk>.
Otherwise ts=w and q=0.
t';fj' are obtained by replacing the fs occurrence (if any) by ts
[remark -- for real interpreter some form of virtual substitution
is probably better.]
Thus B[S,D] <- S if S is a value
<- S+ OW
computes value of term for which S is a state
B[mkstate[t],D] computes value of D-term t
w[3]-recursive?
λx_y_ B[mkstate[F{i}[x_;y_]],D] computes F{i}
w\(w\k) recursive?
Need an assignment of ordinals to states S such that
(0) O[S]=0 if S is value
(1) O[S+]<O[S] OW
Base this on an assignment o[f] st O[S] = PHI[<o[f1]...o[fs]>]
o[f] = 0 if f is basic
o[f] ≤ o[f'] f' = f{fs←ts} as above
o[fs] > PHI[<o[fs1]...o[fsq]>] fsj as above
o[f] depends on #f, f-rank[body], ¬f-rank[body]
Interpreter as Iteration for Hierarchies of recursive functions
Consider (syntactic) class of GR functions
And a-rec defined relative to standard WOs <a for a<<e0
Let E[c,x] = {c}x
Show ordinal of E = sup{c; O[c]}
BM recursion via finite approximations
Consider BM defn f[x_]=t[f,x_] with
f1...fl the f-terms in t listed so that fi<fj only if i>j
machine {qi[x_],ti_[x_]}
measured ordering {m,r}
|- qi -> m[ti_[x_]] <[r] m[x_]
Introduce new symbols zj and define
tl'_=tl_[x_] fl'=fl
tl-1'_=tl-1_[x_]{fl<-zl}; fl-1'=f[tl-1'_]
tl-2'_=tl-2_[x_]{fl<-zl}{fl-1'<-zl-1}; fl-2'=f[tl-2'_]
...
t1'_=t1_[x_]{fl<-zl}...{f2'<-z2} f1'=f[t1'_]
t0'=t'=t{fl<-zl}...{f1'<-z1}
where {term<-zj} means replace that term by the symbol zj
note t[l-j-1]'_ has vars among x_,zl,..z[l-j] j<l
OK1[((x_),z),s] <->
ql -> DEF[s,tl'_] ∧
LET{zl<-s"tl'_}IN
ql-1 -> DEF[s,tl-1'_] ∧
LET{zl-1<-s"tl-1'_}IN
...
q1 -> DEF[s,t1'_] ∧
LET{z1<-s"t1'_}IN
z=t0'
e.g. in case fj is needed in evaluation of f (qj holds) then
fj defined by s and using the approximation s t evaluates to z
e.g. t[lam{y_}[s"y_];x_] = z
OK[<>] <-> T
OK[a.s] <-> OK1[a,s] ∧ OK[s]
The BM interpreter should provide uniform means for computing
OF[x_] st OK[OF[x_]] ∧ DEF[s,x_]
OF will be <*[m,r] recursive
lam{x_}OF[x_]"x_ will satisfy original defining equations
To compute OF[x_] start with
{(m_);(q1[m_]t1_[m_])...(ql[m_],tl_[m_])} ; <>
In state
{(m_)}
s
Return assign[(m_),t"[s,m_],s]
In state
{(m_);(q1[m_]t1_[m_])...(qj[m_],tj_[m_])...{(mk_);}...}
s
Next state is
{(m_);(q1[m_]t1_[m_])...(qj[m_],tj_[m_])...}
assign[(mk_),t"[s,mk_],s]
In state
{(m_);(q1[m_]t1_[m_])...(qj[m_],tj_[m_])...{(mk_);...(qi[mk_],ti_[mk_]}...}
s
If qi[mk_] and mk'_ = ti_"[s,mk'_] then next state is (f[ti_[mk_]] needed)
{(m_);(q1[m_]t1_[m_])..(qj[m_],tj_[m_])..{(mk_);..{(mk'_);..(ql[mk'_],tl_[mk'_])}}.}
s
If ¬qi[mk_] then next state is (f[ti_[mk_]] not needed)
{(m_);(q1[m_]t1_[m_])..(qj[m_],tj_[m_])..{(mk_);..}.}
s
Forms of definition -- ACK examples --
BM def
ACK[m,n] <- IF[Em0,n+1,IF[ En0,ACK[m-1,1],ACK[m-1,ACK[m,n-1]] ]]
ACK[m-1,1]|{m,n,0}
ACK[m-1,ACK[m,n-1]|{m,n;0}]|{m,n,0}
The machine:
q1[m,n] : m≠0∧n=0 t1_[m,n] is (m-1,1)
q2[m,n] : m≠0∧n≠0 t2_[m,n] is (m-1,ACK[m,n-1])
q3[m,n] : m≠0∧n≠0 t3_[m,n] is (m,n-1)
Functional Def
A[0] = lam{n;n+1}
A[m+1] = PR{A[m]1,lam{f,n}A[m][f]}
where PR{g,h}0 = g
PR{g,h}n+1 = h[PR{g,h}n,n]
so A[m+1]0 = A[m]1
A[m+1]n+1 = lam{f,n}A[m][f][A[m+1]n,n]
= A[m][A[m+1]n]
Finite approximation
OK[<>] <- T
OK[x,s] <- AND[OK1[x,s] , OK[s]]
OK1[x,s] <- LET{m←ppx,n←qpx,l←qx}IN
IF[NE[x,((m,n),l)],F,
IF[Em0,E[l,n+1],
IF[En0,APP[s,(m-1,1),l],
IF[DEF[s,(m,n-1)],APP[s,(m-1,s"(m,n-1)),l],
F]]]]
OK0[(m,n),s] <- IF[OK[s],DEF[s,(m,n)],F]
where DEF[s,x] is NE[ASSOC[x,s],0]
s"x is IF[DEF[s,x],q[ASSOC[x,s]],0]
APP[s,x,y] is E[ASSOC[x,s],(x,y)]
ack[s,(m,n)] <- IF[OK[s],ASSOC[(m,n),s],0]
Note that ack[s,(m,n)]≠0 -> q[ack[s,(m,n)]]=ACK[m,n]
e.g. if define A such that OK0[(m,n),A[m,n]] then
ACK[m,n] <- q[ack[A[m,n],(m,n)]] satisfies defining equations for ACK
(this is provable in PR) but A not definable in PR
To construct A prove ∀m,n ∃s,l OK1[((m,n),l),s]
m=0 s=<>,l=n+1
m≠0 n=0 assume s0,l0 such that OK1[((m-1,1),l0),s0]
then s=s0,l=l0 will do
m≠0 n≠0 assume s0,l0 such that OK1[((m,n-1),l0),s0]
and s1,l1 such that OK1[((m-1,l0),l1),s1]
then s=s1*s0 l=l1 will do
How to formulate nicely as <[2](unnested)recursion?
Can't be done as <[k] unnested is PR
OF[(1,1)] for ACK[m,n]
Recall the machine:
q1[m,n] : m≠0∧n=0 t1"_[s,m,n] is (m-1,1)
q2[m,n] : m≠0∧n≠0 t2"_[s,m,n] is (m-1,s"(m,n-1))
q3[m,n] : m≠0∧n≠0 t3"_[s,m,n] is (m,n-1)
t"[s,m,n] <- IF[Em0,n+1,IF[ En0,s"(m-1,1),s"(m-1,s"(m,n-1)) ]]
{(1,1); (F,*) (T,(0,s"(1,0))) (T,(1,0))} <>
{(1,1); (F,*) (T,(0,s"(1,0))) {(1,0); (T,(0,1)) (F,*) (F,*)} } <>
{(1,1); (F,*) (T,(0,s"(1,0))) {(1,0); (T,(0,1)) (F,*) } } <>
{(1,1); (F,*) (T,(0,s"(1,0))) {(1,0); (T,(0,1)) } } <>
{(1,1); (F,*) (T,(0,s"(1,0))) {(1,0); {(0,1); (F,*) (F,*) (F,*)} } } <>
{(1,1); (F,*) (T,(0,s"(1,0))) {(1,0); {(0,1); (F,*) (F,*)} } } <>
{(1,1); (F,*) (T,(0,s"(1,0))) {(1,0); {(0,1); (F,*)} } } <>
{(1,1); (F,*) (T,(0,s"(1,0))) {(1,0); {(0,1);} } } <>
t"[<>,0,1]=2
{(1,1); (F,*) (T,(0,s"(1,0))) {(1,0); } } <((0,1),2)>
t"[<((0,1),2)>,1,0]=<((0,1),2)>"(0,1)=2
{(1,1); (F,*) (T,(0,s"(1,0))) } <((1,0),2) ((0,1),2)>
{(1,1); (F,*) {(0,2);(F,*)(F,*)(F,*)} } <((1,0),2) ((0,1),2)>
{(1,1); (F,*) {(0,2);(F,*)(F,*)} } <((1,0),2) ((0,1),2)>
{(1,1); (F,*) {(0,2);(F,*)} } <((1,0),2) ((0,1),2)>
{(1,1); (F,*) {(0,2);} } <((1,0),2) ((0,1),2)>
t"[<((1,0),2) ((0,1),2)>,0,2]=3
{(1,1); (F,*) } <((0,2),3) ((1,0),2) ((0,1),2)>
{(1,1);} <((0,2),3) ((1,0),2) ((0,1),2)>
t"[<((0,2),3) ((1,0),2) ((0,1),2)>,1,1]=s"(0,s"(1,0))=s"(0,2)=3
! <((1,1),3) ((0,2),3) ((1,0),2) ((0,1),2)>
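The table reached at the end of this trace can be reproduced by a small sketch
(recursive rather than the explicit state machine, with illustrative names)
that assembles exactly the finite approximation s needed for ACK[m,n]:

  def ack_table(m, n, s=None):
      """Extend s with enough ((m,n), value) pairs to determine ACK[m,n]."""
      if s is None:
          s = {}
      if (m, n) in s:
          return s
      if m == 0:
          s[(m, n)] = n + 1
      elif n == 0:
          ack_table(m - 1, 1, s)
          s[(m, n)] = s[(m - 1, 1)]
      else:
          ack_table(m, n - 1, s)
          ack_table(m - 1, s[(m, n - 1)], s)
          s[(m, n)] = s[(m - 1, s[(m, n - 1)])]
      return s

  # reproduces the table reached at the end of the trace
  assert ack_table(1, 1) == {(0, 1): 2, (1, 0): 2, (0, 2): 3, (1, 1): 3}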
BMH Heuristics -- Notes on structures and strategies of BMTP
Conjecture -> set of clauses
Outline of strategy
(1) Simplify
(2) Reformulate to eliminate undesirables (destructors)
(3) Use equalities
(4) Generalize (forget information used and (hopefully) no longer needed
e.g. -- terms that have played their role)
(5) Eliminate Irrelevant terms
(6) Induct
Info
typeset alist
binding alist
lemma data base
Let Stop be current set of clauses to simplify
Let Sbot be current set of clauses simplified and ready for induction
PROVE[Conjecture] <=
Prog{Stop Sbot S' C}
Stop <- CLAUSIFY[Conjecture]
Sbot <- {}
TOP-OF-CLAUSE-FALL
If Stop={} GO CLAUSE-POOL
Let C,Stop <- Select-one[Stop]
S' <- Simplify-clause[C]
If S' ≠ {C} Stop<-Stop∪S'; GO TOP-OF-CLAUSE-FALL ;;;Note S' may be empty
S' <- Elim-Undesirables[C]
If S' ≠ {C} Stop<-Stop∪S'; GO TOP-OF-CLAUSE-FALL
S' <- Use-Equalities[C]
If S' ≠ {C} Stop<-Stop∪S'; GO TOP-OF-CLAUSE-FALL
S' <- Generalize[C]
If S' ≠ {C} Stop<-Stop∪S'; GO TOP-OF-CLAUSE-FALL
S' <- Elim-Irrelevance[C]
If S' ≠ {C} Stop<-Stop∪S'; GO TOP-OF-CLAUSE-FALL
C-IS-PREPARED
Sbot ← C+Sbot
GO TOP-OF-CLAUSE-FALL
CLAUSE-POOL
If Sbot={} RETURN[]
Sbot<-CLEAN-UP[Sbot]
Let C,Sbot <- Select-one[Sbot]
S' <- Generate-Induction[C]
If S'={C} FAIL[]
Stop <- Stop∪S'; GO TOP-OF-CLAUSE-FALL
END
Simplify-clause[Clause] <=
Prog{Sin,Sout,C,old,new,lit,TSalist,val}
C <- Use and discard any literals of form (NOT(EQUAL x t))[Clause]
Sin <- { (<>,Clause) } ;;; set of old,new pairs
Sout <- {}
RESUME-SIMPLIFICATION
If Sin={} Return Sout
C,Sin <- Select-one[Sin]
new <- newpart[C]
old <- oldpart[C]
If old = {} Sout <- new+Sout; GO RESUME-SIMPLIFICATION
Lit,old <- Select-next[old]
TSalist <- Assume-false[old*new]
IF TSalist=must-be-false[] GO RESUME-SIMPLIFICATION ;;;Clause evap
val <- Rewrite-term[Atom-of[Lit],TSalist,MtValist,BOOL,NOhopes,<>]
If Neg[Lit] val <- Negate[val]
If val=T GO RESUME-SIMPLIFICATION ;;; clause evaporates
If val=F Sin <- (new old)+Sin; ;;; delete literal
GO RESUME-SIMPLIFICATION
Sin <- IFsplit[new,val,old]∪Sin
GO RESUME-SIMPLIFICATION
END
Could fail if <> ≡ F instead of adding to Sout
IFsplit[new,val,old] (subsume and replace too)
Assume-False[hyps] <= Assum-F[hyps,<>] ;;; returns must-be-false[] or alist
Assum-F[<>,TSalist] <= TSalist
Assum-F[atm.hyps,TSalist] <=
LET {ty<-Assume-t-or-f[atm,TSalist]} IN
IF ty=must-be-true[] must-be-false[]
ELSEIF ty=must-be-false[] Assum-F[hyps,TSalist]
ELSE Assum-F[hyps,false-part[ty]]
Assum-F[(NOT atm).hyps,TSalist] <=
LET {ty<-Assume-t-or-f[atm,TSalist]} IN
IF ty=must-be-false[] must-be-false[]
ELSEIF ty=must-be-true[] Assum-F[hyps,TSalist]
ELSE Assum-F[hyps,true-part[ty]]
Rewrite-term[atm,TSalist,Valist,hyps,mode,hopes,div] <=
PROG{val,ts}
val <-
IF IsVar[atm] Rewrite-var[atm,TSalist,Valist,hyps,mode,hopes,div]
? ELSEIF IsExplicit[atm] atm
ELSEIF IsEq[atm] Rewrite-eq[atm,TSalist,Valist,hyps,mode,hopes,div]
ELSEIF IsIf[atm] Rewrite-if[atm,TSalist,Valist,hyps,mode,hopes,div]
ELSEIF IsRec[atm] Rewrite-rec[atm,TSalist,Valist,hyps,mode,hopes,div]
ELSE Rewrite-other[atm,TSalist,Valist,hyps,mode,hopes,div]
?p124 RETURN coerce[val,Assumed-Typeset[val,TSalist],mode]
Coerce[val,ts,mode] <=
IF mode≠BOOL
IF ts=ONE.OF{'TRUEP} 'T
ELSEIF ts=ONE.OF{'FALSEP} 'F
ELSE val
ELSEIF ONE.OF{'FALSEP}∩ts=ONE.OF{} 'T
ELSEIF ts=ONE.OF{'FALSEP} 'F
ELSE val
Rewrite-terms[<>,TSalist,Valist,hyps,mode,hopes,div] <= <>
Rewrite-terms[tm.tms,TSalist,Valist,hyps,mode,hopes,div] <=
Rewrite-term[tm,TSalist,Valist,hyps,mode,hopes,div] .
Rewrite-terms[tms,TSalist,Valist,hyps,mode,hopes,div]
Rewrite-var[atm,TSalist,Valist,hyps,mode,hopes,div] <=
IF[BoundP[atm,Valist] Value[atm,Valist] atm]
Rewrite-eq[(EQUAL t1 t2),TSalist,Valist,hyps,mode,hopes,div] <=
PROG{val,val1,val2}
val1 <- Rewrite-term[t1,TSalist,Valist,hyps,VALUE,Nohopes,div]
val2 <- Rewrite-term[t2,TSalist,Valist,hyps,VALUE,Nohopes,div]
IF val1=val2 RETURN 'T
IF Must-be-unequal[val1,val2,TSalist,Valist] RETURN 'F
val <- apply-EQUAL-rewrite-rules[(EQUAL val1 val2)]
val <- apply-rewrite-lemmas[val,TSalist,Valist,hyps,mode,hopes,div]
RETURN val
END
Must-be-unequal[val1,val2,TSalist] <=
OR [Typeset[val1,TSalist] ∩ Typeset[val2,TSalist] = ONE.OF{}
AND[Explicit-value[val1], Explicit-value[val2], val1≠val2 ]
AND[BTM[val1],CONSTR[val2]]
AND[BTM[val2],CONSTR[val1]]
ProperShellSubterm[val1,val2,Typeset[val1,TSalist]]
ProperShellSubterm[val2,val1,Typeset[val2,TSalist]]
]
Rewrite-if[(IF t0 t1 t2),TSalist,Valist,hyps,mode,hopes,div] <=
PROG{val0,val1,val2,tv}
val0 <- Rewrite-term[t0,TSalist,Valist,hyps,BOOL,Nohopes,div]
tv <- Assume-t-or-f[val0,TSalist]
IF tv = must-be-true[]
val <- Rewrite-term[t1,TSalist,Valist,hyps,mode,hopes,div]
IF tv = must-be-false[]
val <- Rewrite-term[t2,TSalist,Valist,hyps,mode,hopes,div]
OW
val1 <- Rewrite-term[t1,truepart[tv],Valist,hyps,mode,hopes,div]
val2 <- Rewrite-term[t2,falsepart[tv],Valist,hyps,mode,hopes,div]
val <- apply-if-rewrite-rules[(IF val0 val1 val2)]
RETURN val
END
Rewrite-rec[(r t1),TSalist,Valist,hyps,mode,hopes,div] <=
PROG{val,tv}
val <- Rewrite-term[t1,TSalist,Valist,hyps,VALUE,Nohopes,div]
tv <- Assume-t-or-f[(r val),TSalist]
IF tv = must-be-true[] RETURN 'T
IF tv = must-be-false[] RETURN 'F
val <- apply-rewrite-lemmas[(r val),TSalist,Valist,hyps,mode,hopes,div]
RETURN val
END
Rewrite-other[(f t1...tn),TSalist,Valist,hyps,mode,hopes,div] <=
PROG{vals,tv}
vals <- Rewrite-terms[(t1...tn),TSalist,Valist,hyps,VALUE,Nohopes,div]
IF non-rec[f] ∨ measured-subset-exval[f,vals] RETURN
rewrite-term[body[f],TSalist,Assign[formals[f],vals],hyps,mode,hopes,div]
IF fεexpanding ∨ Undef[f] RETURN (f . vals)
val <- apply-rewrite-lemmas[(f . vals),TSalist,Valist,hyps,mode,hopes,div]
IF val=(f . vals) ;;; e.g. no rewrite
val <- rewrite-term[body[f],TSalist,Assign[formals[f],vals],
hyps,mode,hopes,f.div]
RETURN Bestof[val,(f.vals)]
END
Apply-rewrite-lemmas[atm,TSalist,Valist,hyps,mode,hopes,div]
Assume-t-or-f[atm,TSalist] <=
IF IsEq[atm] Assume-eq[atm,TSalist]
ELSEIF IsRec[atm] Assume-rec[atm,TSalist]
ELSE Assume-other[atm,TSalist]
Assume-eq[(EQUAL t1 t2),TSalist] <=
PROG{ts1,ts2,a1,a2}
ts1 <- Typeset[t1,TSalist]
ts2 <- Typeset[t2,TSalist]
IF ts1∩ts2=ONE.OF{} RETURN must-be-false[]
IF Sing-sing[ts1]∧ts1=ts2 RETURN must-be-true[]
a1 <- Assume[(EQUAL t1 t2),ONE.OF{'TRUEP},
Assume[(EQUAL t2 t1),ONE.OF{'TRUEP},
Assume[t1,ts1∩ts2,
Assume[t2,ts1∩ts2,TSalist] ]]]
a2 <- Assume[(EQUAL t1 t2),ONE.OF{'FALSEP},
Assume[(EQUAL t2 t1),ONE.OF{'FALSEP},TSalist]]
IF Sing-sing[ts1] a2 <- Assume[t2,ts2//ts1,a2]
IF Sing-sing[ts2] a2 <- Assume[t1,ts1//ts2,a2]
RETURN (a1 a2)
END
Assume-rec[(r t1),TSalist] <=
PROG{ts1,a1,a2}
ts1 <- Typeset[t1,TSalist]
IF ts1=ONE.OF{r} RETURN must-be-true[]
IF ts1∩ONE.OF{r}=ONE.OF{} RETURN must-be-false[]
a1 <- Assume[t1,ONE.OF{r},TSalist]
a2 <- Assume[t1,ts1/ONE.OF{r},TSalist]
RETURN (a1 a2)
END
Assume-other[t1,TSalist] <=
PROG{ts1,a1,a2}
ts1 <- Typeset[t1,TSalist]
IF ts1=ONE.OF{'FALSEP} RETURN must-be-false[]
IF ts1∩ONE.OF{'FALSEP} = ONE.OF{} RETURN must-be-true[]
a1 <- Assume[t1,ts1/ONE.OF{'FALSEP},TSalist]
a2 <- Assume[t1,ONE.OF{'FALSEP},TSalist]
RETURN (a1 a2)
END
Typeset[atm,TSalist] <=
IF HasTS[atm,TSalist] TSassoc[atm,TSalist]
ELSEIF IsVar[atm] NONE.OF{}
ELSEIF IsEq[atm] ONE.OF{'TRUEP,'FALSEP}
ELSEIF IsRec[atm] ONE.OF{'TRUEP,'FALSEP}
ELSEIF IsCons[atm] ONE.OF{Type[atm]}
ELSEIF IsAccess[atm] Tyset[atm]
ELSEIF IsIf[atm]
LET{ty <- Assume-t-or-f[test[atm],TSalist]} IN
IF ty=must-be-true[] Typeset[then[atm],TSalist]
ELSEIF ty=must-be-false[] Typeset[else[atm],TSalist]
ELSE Typeset[then[atm],truepart[ty]] ∪
     Typeset[else[atm],falsepart[ty]]
ELSE
LET{(ts,args) <- Typescr[fun[atm]]} IN
FOR {i IN args}
ts <- ts∪Typeset[arg[i,atm],TSalist]
RETURN ts
END
Typescr is associated with function at definition time (after acceptance)
Updated via REWRITE lemmas of form (r (f X1..Xn))
ts is typeset for t (wrt TSalist) iff
(1) ts is typeset
(2) For every interpretation env for vars of t (satisfying TSalist)
val[t,env] ε U[ts]
(ts,pars) is Typescr for f (where f formals <= body) iff
(1) ts is typeset
(2) pars ⊂ formals
(3) if ts{i} is typeset for ti 1≤i≤n
then ts+U{i:pars}ts{i} is typeset for (f t1...tn)
(ts,vars) is Definition type set for t if
(1) vars ⊂ Vars[t]
(2) For every interpretation env for vars
val[t,env] ε U[ts] ∨ val[t,env]=val[x,env] some xεvars
Defntypeset[atm] <= Dts[atm,<>]
Dts[atm,TSalist] <=
IF IsVar[atm] (ONE.OF{},{atm})
ELSEIF IsEq[atm] (ONE.OF{'TRUEP,'FALSEP},{})
ELSEIF IsRec[atm] (ONE.OF{'TRUEP,'FALSEP},{})
ELSEIF IsCons[atm] (ONE.OF{Type[atm]},{})
ELSEIF IsAccess[atm] (Tyset[atm],{})
ELSEIF IsIf[atm]
LET{ty <- Assume-t-or-f[test[atm],TSalist]} IN
IF ty=must-be-true[] Dts[then[atm],TSalist]
ELSEIF ty=must-be-false[] Dts[else[atm],TSalist]
ELSE Dts[then[atm],truepart[ty]] ∪x∪
     Dts[else[atm],falsepart[ty]]
ELSE
LET{(ts,pars) <- Typescr[fun[atm]]} IN
FOR {i IN pars}
ts <- ts∪tspart[Dts[arg[i,atm],TSalist]]
vars <- vars∪varspart[Dts[arg[i,atm],TSalist]]
RETURN (ts,vars)
END
Compute typescr for f formals <= body
(ts,pars) <- (ONE.OF{},{})
APPROX
Assume typescr[f] = (ts,pars) IN
(ts1,vars) <- dts[body,<>]
If ts1⊂ts ∧ vars⊂pars RETURN (ts,pars)
ts <- ts1∪ts
pars <- vars∪pars
GO APPROX
Possible Decision procedures buried in SIMPLIFY
Suppose CLAUSE ε TAUT
will SIMPLIFY |- CLAUSE≠F
is Typeset[atm,Assume-false[CLAUSE-atm]]∩ONE.OF{'FALSEP}=ONE.OF{}
for some/each atmεCLAUSE?
If tm is ground (or vars bound to explicits in Valist)
Rewrite-term[tm,...,Valist...] ε Explicit Values
(if non-defined functions allowed, then ground must mean
all function symbols are EVpreserving.)
Is the elementary (quantifier/quantifier-free) theory of SHELLs decidable?
Does SIMPLIFY decide the qf cases?
(Can we include the shell wfn,wfn*?)
Allowing explicit definitions won't hurt, as they can be eliminated.
Suppose in a GENTZEN system
G => atm1,...atmk
with G universal (axioms of BM context C), then
C |- atm1,...,atmk
Can we use GENTZEN proof to show BMTP will verify the corresponding CLAUSE?
(possibly by introducing lemmas)
To show TP does it all show TP |-
(1) Any tautology
(2) Any instance of Equality axiom derivable
(3) Any instance of Shell ax
(4) Any instance of function definition
(5) closed under MP
(6) closed under MI(LESSP)
? How to force TP to `instantiate and simplify'
? How to force TP to use a given (legitimate) induction scheme
Machines
Basic f-machine for body
<... case{i} : <...s{i,j}...> ...>
where case{i} is the f-free test for branch i
and the s{i,j} are substitutions giving the occurrences of f on this branch
(in order for actual machine computation to agree with definition
principle must have IFs moved outermost )
Analysis of function definition:
Consider each subset of the formals and permutations thereof
Search for measured lex order {r,m_} justifying introduction of f
Make modified machine entry (induction template) for each success
Data --
the measure -- r,m_,measured subset of vars
for each case{i}
let Hyp{i,j} be induction lemma hyps required to verify
that s{i,j}<{r,m_}x_ then
case{i} <- ∧{j}Hyp{i,j}
Induction schemes are constructed solely from the templates for
function symbols occuring in the clause to be proved
For given template
unchangeables -- pars in measured positions that do not change on any recursive call
changeables -- pars in measured positions that do change on a recursive call
Template applies to term if
changeables are distinct variables
no changeable var occurs in unchangeable position
The induction suggested by template that applies to term
instantiate template
prune substitutions
(1) omit subs for non-variables
(2) omit subs for unchangeables
(3) omit second subs for given variable (for unmeasured position)
The scheme is decorated with score and list of terms for which it accounts
Score -- how well does resulting scheme match term(s) it accounts for
Subsumption
Merging -- add tests to case analysis, extend substitutions
General thoughts about TP data machine
The idea is to operate on the set of conjectures, trying to reduce it
to empty set (by reducing clauses to T)
Work in `history' H
L(H) the language of H
PBM
SHELLs(H)
DEFNs(H)
DECLs(H)
Facts stored as
lemmas of various types
typeset info
about shell function - rec,const,btm,ac,typerestrictions
typeprescription for defined function
info about type of function symbol
shell functions
shell.rec, shell.const, shell.ac,shell.btm,shell.wfn /measure
ed, rd, ud (e.g. a function parameter)
induction machine for rd functions
Keeping track
Context of TSassumptions
Virtual substitutions
Explicit values/explicit value preserving
Goals
Hyps attempting to prove
Functions opened for examination
Value or Truth
Hoping to prove or refute? (hyp)
Notes on heuristics
p87 it is assumed that luser formulates conjectures in the most general way
p107
T1** (EQUAL (VALUE X A) (VALUE (NORMALIZE X) A))
together with T2*,T3* and FALSIFY1.FALSIFIES
will in principle prove T5*, but in practice
T1** is a losing form of rewrite rule -- will go down the infinite descent tube
Notice the difference in the way
E1. (EQUAL (EQUAL t1 t2) (EQUAL s1 s2))
and
E2. (IMPLIES (EQUAL s1 s2) (EQUAL t1 t2))
are used
rewrite throughout vs reduction at top level of expression to alternate form
p111. implicit ∃ why not try harder (some sort of unification to find ∃var)
p126. replacement of {<(NOT P) l1 ... lk> <P l1 ... lk>} by {<l1 ... lk>}
logically equivalent but may be harder to prove
is the reason we don't lose that essentially any useful
case analysis of this sort will (or can) be forced by
rewrites, eliminations, etc?
Notes on BM
Note that BM present the theory taking the notion of WFO as basic and
justify definition by recursion and proof by induction using WFOs.
Basic WFO are those determined by shell definition. Additional
constructions are the measured lexicographic orderings over the basic
In extensions of NUMBERP we may wlog consider only LESSP as
basic wfn and introduce additional shells with the basic COUNT measure function.
function is represented by structure that includes defining equation
and reason for accepting the definition (all reasons it can find)
and typeset info (e.g. a type prescription)
Consider `normal form' thm for a-recursion:
Given CT and code (e,a) assign ordinal &a to computation tree
ala Kleene-Brouwer/Tait to obtain linear ordering
Then (e,a) is &a-recursive.
Should be same as (or simply related to) TI[a] |- ∃v. E(e,a,v)
See Howard
Take sup{a:ENV;Ord[e,a]} for Ord*[e]
Similarly for applicative subtree (should get essentially same structure)
Compare to Ordinals assigned by Moschovakis.
Probably have to restrict means of assignment, in order to get non-trivial
classification.
Going from e,a to sup should explicate for example the ACK function
as sup of seq of PR functions.
Note on Routledge
[CLT -- further analysis -- study constructions, proofs to see what is
required to formalize them. What can be inferred from that about the
content of the result? How to reduce to PR WO? ]
Systems of arithmetic and recursive functions to consider.
Fix a standard representation of ordinals <e0 giving canonical notion
of <a for a<e0 etc.
(1)
Let PA[a] be Peano Arithmetic with TI[b] b<a, where TI[b] is induction on <b
for Sigma[0,1] formulae.
Let e[E;phi] be encoding of recursive definition of phi via equations E
(ala kleene) and T[e,x,y] express that y is derivation of `phi[x]=y' from E.
(More generally have T/n[e,x1...xn,y] for n-ary phi.)
Then if PA[a] |- ∀x∃yT[e,x,y] we say that phi is PA[a]-provably recursive
and we may introduce function PHI with defining equations E
(2)
Like (1) but PA[n] with ordinary induction restricted to Pi[0,n] formulae
(P[0,2n?])
(3)
RA[a] -- recursive arithmetic, QF, with b-induction rule for b<a and
b-recursive definition for b<a
[need precise notion of b-recursive definition ]
[From Tait68 Nested]
Let e code recursion equations for phi and T be the relevant T-predicate.
By Ackermann's analysis of proofs in Zu[H-B], if
** ∀x,y∃zT[e,x,y,z]
is provable then phi is <w[r]-recursive for some r.
Kreisel52 shows that if phi is nested-<w[n]-recursive then ** is provable.
[In fact w[n]-provable?]
Would like to have that <w[n] |- ∀xy∃zT[e,x,y,z] implies phi/e is <w[n+1] rec
More generally if <a |- ∀xy∃zT[e,x,y,z] implies phi/e is <w\a rec
Notes
α-recursion --
fixed (pr) wfo of type α + rank function
vs
arbitrary (pr) wfo of type α
Some (more or less) standard notation for ordinals, schema, systems
Notation:
x,y,z range over `N
a,b,c,...over PR-relations -- write x <a y for a[x,y]=0
J,K,L standard PR pairing
p/i ith prime
Irreflexivity 01. PRA0 |- x <a y -> y /<a x
Transitivity 02. PRA0 |- x <a y ∧ y <a z -> x <a z
Zero-is-least 03. PRA0 |- 0 <a x'
Linear 04. PRA0 |- x <a y ∨ x=y ∨ y <a x
Unless otherwise noted assume a,b,... satisfy 1-3. e.g. are SO.
Course of values function for a ( written x|y if a is fixed by context )
CV[a;x,y] <- x if x <a y
             0 OW
The induction rule for a
I{a} A[0] A[CV[a,t,x]] -> A[x] // A[x]
The induction schema for a?
S{a} A[0] ∧ A[CV[a,t,x]] -> A[x] // A[x]
The recursion rule for a.
R{a} F[c,0] <- G[c,0]
F[c,x'] <- H[c,x,F[c,CV[a,t[c,x],x']]]
Say I[a];R[a] valid for S if there is no function f definable in S st
It{f;n;m} >a It{f,n;m'} for all m
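A Python sketch of the recursion rule R{a} (the comparison a, G, H, t and the
example instance are illustrative placeholders; termination of the sketch is
exactly the well-foundedness assumption on a):

  def CV(a, x, y):                 # course of values function for a
      return x if a(x, y) else 0

  def make_F(a, G, H, t):
      def F(c, x):
          if x == 0:
              return G(c, 0)       # F[c,0] <- G[c,0]
          # F[c,x'] <- H[c,x,F[c,CV[a,t[c,x],x']]], with x' = x
          return H(c, x - 1, F(c, CV(a, t(c, x - 1), x)))
      return F

  # with a = ordinary < and t(c,x) = x this is just primitive recursion;
  # here instantiated so that F(c, x) computes c + x.
  F = make_F(lambda u, v: u < v,
             lambda c, _: c,                  # G
             lambda c, x, prev: prev + 1,     # H
             lambda c, x: x)                  # t
  assert F(3, 4) == 7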
PRA
PRA[a]
PA
T
Summary of results (translated in to std notation)
[From Tait68 Nested]
Let e code recursion equations for phi and T be the relevant T-predicate.
By Ackermann's analysis of proofs in Zu[H-B], if
** ∀x,y∃zT[e,x,y,z]
is provable then phi is <w[r]-recursive for some r.
Kreisel52 shows that if phi is nested-<w[n]-recursive then ** is provable.
[In fact w[n]-provable?]
Would like to have that <w[n] |- ∀xy∃zT[e,x,y,z] implies phi/e is <w[n+1] rec
More generally if <a |- ∀xy∃zT[e,x,y,z] implies phi/e is <w\a rec
From Howard68
For lambda-calculus notation for terms of G\:odel's T, define a reduction
relation on terms and an assignment of ordinals less than e0 to terms
such that reduction lowers the corresponding ordinal.
Claims - can formalize notions in PRA in such a way that computability of
terms ala [Tait Intensional Interpretations of Functionals of Finite Type]
follow from descending chain principle for ordinals <e0.
[CLT summary -- the technique is to define a class of expressions (that
can be interpreted as ordinals), and class of expression `vectors'
with operations corresponding to abstraction and application.
Terms are coded as expression vectors. An element of the
expression vector is used via the ordinal interpretation of expressions
to assign ordinals to the code and thus to the term.
With a suitably restricted notion of reduction, can find assignment
that uses ordinals <w[k] for terms of PR-level at most k.
Note that level[R{(0)((0)(&s)&s)(&s)&s}] = level[&s]+2
thus T[1] = terms with no recursion.
T[2] = terms with recursion with type 0 parameters
etc.
]
From Schwichtenberg75
PRA ::: Prim Rec Arithmetic [quantifier free]
Z ::: Arithmetic
T ::: G\:odels QF system PRA of finite type
X |: f reads the theory X defines the function f
[a/l]-recursion a-recursion of type level l
Proof theoretic strength added to QF systems by
(1) Definition of functions via <a-recursion
(2) Definition of PR-functionals of higher type
Gentzen New (Con[Z]) can be formalized in
PRA + TI[<a] a<e0 ::: PRA[TI-<e0]
PRA + Rec[<a] a<e0 [Kreisel Proof by Ti in QFsystems]
->
Con[PRA <e0] implies Con[Z]
Z[Rec-<e0] is conservative extension of Z [Hilbert/Bernay Grund]
PRA[TI-<e0] subtheory Z[TI-<e0]
?CLT Z[TI-<e0] is conservative extension of Z [Gentzen Prov and non-Prov]
PRA[TI-<e0] subtheory Z[TI-<e0]
->
Con[Z] implies Con[PRA <e0]
Z can be interpreted in T [G\:odel Dialectica]
->
Con[T] implies Con[Z]
Con[Z] implies Con[T] [Kreisel Inessential ext HA]
Compare higher type vs higher ordinal recursion
PRA[<e0] |: f implies T |: f [Kreisel Math.Logic 3.4 using G\:odel ]
exp[2,a]-(CV)recursion <- a-(CV)recursion [Tait Constructive]
constructs higher type functional
also reduction of w+1-(CV)recursion to PR at higher type
T |: f implies PRA[<e0] |: f [follows from Kreisel Inessential]
[Tait Inf long terms]
Interpreter for terms representing f of degree ≤ k
(of finite type) is 2{k,w*2} recursive
[Howard Ordinals for PRfunls]
[Parsons Restricted Induction Schemata]
Bibliography: Ordinal Recursion, Induction, and quantifier complexity in
Formal systems
Ackermann,W.[1940]
Zur Widerspruchsfreiheit der reinen Zahlentheorie
Math. Ann. 117
pp.162-194.
[according to Tait[Nested]]
Let e code recursion equations for phi and T be the relevant T-predicate.
By Ackermann's analysis of proofs in Zu[H-B], if ∀x,y∃zT[e,x,y,z] is provable
then phi is <w[r]-recursive for some r.
Axt,P.[1959]
On a subrecursive hierarchy and primitive recursive degrees,
TAMS 92,
pp.85-105.
Feferman,S.[1962]
Classification of Recursive Functions by means of Hierarchies
TAMS 104,
pp. 101-122.
Motivation, background [see Kleene EE]
Summary of results: Considers certain binary relations << on functions and
related hierarchies of functions {&r{d} | d:O } st
(1.1) c <{O} d $imp &r{c} << &r{d}
(1.2) For any &phi:GR can find d:O with |d| = exp[&w,2] and &phi << &r{d}
(1.3) Can find paths P in O with |P| = exp[&w,3] and such that
for any &phi:GR can find d:P such that &phi << &r{d}
Further for any &k<(=)&w1 can find such P with |P|=&k+exp[&w,3] (=&w1)
(1.4) Can find incomplete paths P in O, e.g. P with |P| = &w1 and such that
for any d:P there is &phi:GR such that &r{d} << &phi
(1.5) The &r{d} in Kleene sub-rec hierarchy with |d|<exp[&w,2] are ordinal
recursive wrt. `natural ordering' of type exp[&w,exp[&w,3]]
(1.6) The &r{d} in the majorizing hierarchy with |d|<exp[&w,2] are primitive
recursive.
(1.7) There is a set &Delta of recursive functions densely ordered by <<
Uses `non-standard' extensions of hierarchies.
Remarks on similar technology developed in work on progressions of theories
Gentzen,G.[1938]
New version of the Consistency Proof for Elementary Number Theory,
Fors.z.Logik u.z.Grund.d.exakten Wis. New Series 4 Leipzig (1938)
pp. 19-44.
(#8 in Szabo,M.(ed.) "The Collected Papers of Gerhard Gentzen" N-H (1969))
Gentzen,G.[1943]
Provability and Nonprovability of Restricted Transfinite Induction in
Elementary Number Theory,
Math.Ann. 119
pp. 140-161.
(#9 in Szabo,M.(ed.) "The Collected Papers of Gerhard Gentzen" N-H (1969))
Gerber,H.[1967]
An Extension of Sch\:utte's Klammersymbols
Math.Ann. 174
p.203-216.
Goodstein,R.L.[1957]
"Recursive number theory,"
N-H (1957)
Grzegorczyk,A[1964]
Recursive Objects in all Finite Types,
Fund.Math. 54
pp. 73-93.
(TAIT67IFFT: G's treatment of IPR functionals via combinators similar to T's)
Guard,J.R.[1962]
A Classification of Ordinal recursive functions,(abstract)
NotAMS 9
p. 338.
Guard,J.R.[1962]
Hierarchies of recursive arithmetics,(abstract)
NotAMS 9
p. 339.
[Tait65X claims]
Announces that I[a*b] is not in general derivable from I[a],I[b] in PRA-like systems.
Hilbert,H.H. Bernays,P.[1939]
"Grundlagen der Mathematik I,II"
Springer Verlag (1939,1970)
Howard,W.A.[1968]
Assignment of ordinals to terms for primitive recursive functionals of finite type,
BUFFALO68
pp. 435-442.
Kleene, S.C.[1958]
Extension of an Effectively Generated Class of Functions by Enumeration,
Colloq.Math. 6,
pp. 67-78.
1. Given recursive &phi, how can one relate the form of recursion to the natural ordering
on n-tuples given by x_ << y_ iff &phi[x_] is used in the computation of &phi[y_]?
For forms ala Peter, higher recursion corresponds to higher order type.
Hilbert's theory of recursions using higher type variables is a possible method
of connecting recursive functions and ordinals.
The present paper proposes a method of associating ordinals with recursive functions.
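A tiny executable illustration of that ordering (my sketch, not from the paper; the
two-argument Ackermann function and the name immediate-calls are my choices):

(defun ack (m n)
  ;; the usual two-argument Ackermann-Peter recursion
  (cond ((= m 0) (+ n 1))
        ((= n 0) (ack (- m 1) 1))
        (t (ack (- m 1) (ack m (- n 1))))))

(defun immediate-calls (m n)
  ;; the pairs <m',n'> with <m',n'> << <m,n>, i.e. those whose values
  ;; are demanded in one step of computing (ack m n)
  (cond ((= m 0) nil)
        ((= n 0) (list (list (- m 1) 1)))
        (t (list (list m (- n 1))
                 (list (- m 1) (ack m (- n 1)))))))

;; e.g. (immediate-calls 2 2) => ((2 1) (1 5))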
Kino,A.[1968]
On Provably recursive functions and Ordinal recursive functions
J.Math.Soc.Japan 20, No.3
pp.465-476.
For suitable @T the following theorems are proved.
Theorem 1. Let &phi be a provably recursive function in @T (e.g. for
code e of &phi @T |- $A{m_; $E{n; T_[e,m_,n] }} ) then can find
s:O[@T] and lt[m,n] representing <[s] for some arithmetization of
the system such that &phi is lt-recursive.
Theorem 2. If lt is provable PR-WO in @T and &phi is lt-recursive then
&phi is provably recursive in @T.
Proof of 1 involves defining a procedure for transforming a proof of
recursiveness in @T to `normal form', obtaining a realization of the
existential variable, then arithmetizing and showing the resulting definition
can be cast as lt-recursion, where s is the o.d. of the proof.
Proof of 2 is straightforward using induction on the definition of &phi
and provability of TI[lt], where (VI) is involved. (Need ordinary
induction for clause (V), the ordinary PR definition scheme.)
Relation to G\:odel PR functionals.
[G\:odel Dialectica]
claims every lt-recursive function for |lt| < &e0 is PR functional
[Kreisel JSL 24 (abstr.) 1959]/[Tait JSL 24 (abstr.) 1959]
showed every type (0,0) PR functional is lt-recursive for some WO lt with |lt|<&e0.
Thus <[&e0]-recursive and PR-(0,0) are equivalent in arithmetic.
Kreisel,G.[1951]
On the Interpretation of non-Finitist Proofs, Parts I and II,
JSL 16 (1951)
pp. 241-267.
JSL 17 (1952)
pp. 43-58.
[From Tait68 Nested]
Let e code recursion equations for phi and T be the relevant T-predicate.
**: ∀x,y∃zT[e,x,y,z]
Kreisel52 shows that if phi is nested-<w[n]-recursive then ** is provable
in Zu[H-B]. (Using Gentzen's result that <w[n]-induction is derivable.)
Using Ackermann's analysis of proofs in Zu[H-B], if ** is provable then
phi is <w[r]-recursive for some r.
Kreisel G.[1959]
Inessential Extensions of HA by means of Functionals of Finite Type (abstract)
JSL 24
p. 284.
Kreisel,G.[1959]
Proof by transfinite induction and definition by transfinite recursion
in quantifier free systems (abstract),
JSL 24 (1959)
pp.322-323.
Kreisel,G.[1960]
Ordinal Logics and the characterization of informal concepts of proof
Proc.Int.Cong.Mathematicians (1958)
pp.289-299.
Kreisel,G. and Tait,W.W.[1960]
Induction and Recursion (abstract),
JSL 25,
p. 385.
Kreisel,G. Shoenfield,J. Wang,H.[1960]
Number theoretic concepts and recursive well-orderings,
Arch.f.math.logik u. Grund. 5
pp 42-64.
(referred to as H O and W in GK[informal])
Kreisel,G. and Tait,W.W.[1961]
Finite Definability of Number-theoretic Functions and Parametric Completeness
of Equational Calculi,
Zeitschr. f. Math. Logik und Grundlagen d. Math. 7,
pp. 28-38.
Kreisel,G. Levy,A.[1968]
Reflection Principles and their Use for Establishing the Complexity of
Axiomatic Systems,
Zeitschr.f.Math.Logik 14 (1968)
pp. 97-142.
L\:ob,M.H. Wainer,S.S.[1970]
Hierarchies of number-theoretic functions I.,II.
Arch.Math.Log. 13,
pp. 39-51 (I), pp. 97-113(II).
Myhill,J.[1953]
A Stumbling Block in Constructive Mathematics (Abstract),
JSL 18
p.190.
Paris,J.B. Kirby,L.A.S.[1978]
Sigma↓n-Collection schemas in arithmetic,
in Macintyre,A., Pacholski,L. and Paris,J. (eds.) "Logic Colloquium '77", N-H 1978
pp. 199-210.
Parsons,Ch.[1966]
Reduction of Inductions to Quantifier-free induction (abstract)
NotAMS 13
p. 740.
Parsons,Ch.[1966]
Ordinal Recursion in partial systems of number theory (abstract)
NotAMS 13
pp. 857-858.
Parsons,Ch.[1968]
On A Number Theoretic Choice Schema and Its Relation to Induction
BUFFALO68
pp. 459-473.
Parsons,Ch.[1971]
Proof-theoretic analysis of restricted induction schemata (abstract)
JSL 36
p. 361.
Parsons,Ch.[1972]
On n-Quantifier Induction,
JSL 37
pp. 466-482.
Routledge,N.A.[1953]
Ordinal Recursion,
Proc.Cambridge Phil.Soc. 49
pp.175-182.
Sanchis,L.[1967]
Functionals Defined by Recursion,
Notre Dame Journal Formal Logic 8,
pp. 161-174.
Schmerl,U.R.[1979]
A fine structure generated by reflection formulas over PRA
LC78
pp. 335-350.
Schwichtenberg,H.[1973]
Elimination of Higher Type Levels in Definitions of Primitive Recursive
Functionals by Means of Transfinite Recursion
LC73
pp. 279-303.
Tait W.W.[1959]
A Characterization of Ordinal Recursive Functions (abstract)
JSL 24
p. 325.
Tait W.W.[1961]
Nested Recursion,
Math.Annalen 143
pp.236-250.
Tait W.W.[1965]
Functionals Defined by Transfinite Recursion,
JSL 30 (1965)
pp. 155-174.
Tait W.W.[1965]
Infinitely Long terms of Transfinite Type,
in Crossley,J.N. and Dummett,M.A.E.(eds.) "Formal Systems and Recursive Functions"
N-H(1965)
pp. 176-185.
Tait W.W.[1967]
Intensional Interpretations of Functionals of Finite Type I.
JSL 32 (1967)
pp. 198-212.
Turing,A.M.[1939]
Systems of logic based on ordinals
Proc.London Math.Soc. (2) 45
pp. 161-228.
Tait,W.W.[1968]
Constructive Reasoning,
LMPSIII,
pp.185-200.
Wainer,S.S.[1970]
A classification of the ordinal recursive functions,
Arch.Math.Log. 13,
pp. 136-153.
Wainer,S.S.[1972]
Ordinal Recursion, and a Refinement of the Extended Grzegorczyk Hierarchy,
JSL 37,
pp. 281-292.
Wilkie,A.J.[1978]
Some Results and Problems on Weak Systems of Arithmetic,
LC77
pp. 285-296.
Further work (see Shepherdson 1965) on algebraic characterization of
consequences of induction rules.
Motivation for study: independence results and search for number theoretic
statements having direct proofs. (see Kreisel)
Theorem 3.3 (partial characterization of open consequences of free-variable
induction axioms.)
Key lemma: The system with all open induction axioms has the same universal consequences
as the system with induction axioms of the form ∀x∃y[ny≤x<n[y+1]] (n=1,2,...)
The only universal theory implying P{1} (universal parameter free induction)
is the set of all true universal sentences.
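For concreteness, one reading of the schema in the key lemma above (my gloss, not
Wilkie's wording): for each fixed n ≥ 1 the axiom just asserts division with
remainder, with y witnessed by the integer quotient.

(defun witness-y (x n)
  ;; for n >= 1 this y satisfies ny <= x < n(y+1)
  (floor x n))

;; e.g. (witness-y 17 5) => 3, and indeed 5*3 <= 17 < 5*4.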
Parsons in LMPS V (Hintikka-Butts ed)
What is the Iterative Conception of Set?
Hinata,S.[1967]
Calculability of primitive recursive functionals of finite type
Science Reports of the Tokyo Kyoiku Daigaku, A, 9, pp. 218-235.
Other files
O[GRT,CLT] Ordinal arithmetic/ordinal notations
OREC[GRT,CLT] Various classes of (recursive) functions
FRAG[GRT,CLT] Fragments of PA (systems with quantifiers)
PRA[GRT,CLT] Quantifier free systems -- PRA and extensions
higher recursion, higher type, ordinal recursion
Ref to get
∂19-Sep-81 1355 CLT
To: boyer at UTEXAS-20, moore at UTEXAS-20, CLT at SU-AI
I just wanted to let you know of a project I have begun, relating to
your theorem prover (henceforth referred to as TP). Basically,
it amounts to a `proof theoretic' analysis of existing TP,
seeing where existing work in logic has applications, seeing where
TP (or its metamathematics) has implications for work in formal systems, etc.
The current plan is something like:
(1) embed TP in a standard logical theory and see what class of functions
are admitted and what class of theorems provable.
A natural choice is Primitive Recursive Arithmetic extended by rules for
ordinal recursions/inductions for a family of standard encodings for ordinals <e0.
(2) relate the classification of (1) to a fragment of Peano Arithmetic
(induction for restricted class of formulae such as Pi{0,2} e.g. AE)
and to systems with higher types (such as Godel's T).
(3) look at some natural generalizations of TP logical theory --
induction/recursion on larger ordinals
introduction of higher types
???
(4) see how TP and possible generalizations suggest interesting and
useful subsystems of other formal systems.
(1) and (2) are basically a matter of extracting results from
existing literature, and looking for degeneracies, etc.
I have a pretty good idea of what the case is.
For (3) candidate generalizations are not lacking.
What seems interesting is how they interact with existing heuristics
and whether there are some useful new heuristics for particular
generalizations. Also, where are these generalizations useful?
You mention in a footnote in your book that e0 was considered
(and I gather abandoned). Have you thought about heuristics
for generating candidate orderings of ordinal type greater than
those generated by LEX from <? Seems like an interesting,
but perhaps hard problem. Also, have you considered computations
defined using functionals of higher type?
For (4) I have in mind Feferman's FM systems -- TP embeds naturally in
the full system and suggests an interesting subsystem.
In fact this all started when I loaned Feferman your book.
He was impressed and intrigued with TP,
but balked at reading axioms, proofs,... written in S-expressions.
(Hence the initial motivation for embedding TP in PRA+.)
Feferman has just finished writing up his work on FM and will send you
a copy. I hope to finish a first pass on (1) and (2) soon. When I
get something written, I'll send it along.
Hmmm, this is longer than I expected, but I thought it would be good
to let you know what we are doing.
How's Texas?
Regards,
Carolyn
∂24-Sep-81 1340 CL.BOYER at UTEXAS-20 ONCE AGAIN
Date: 24 Sep 1981 1513-CDT
From: CL.BOYER at UTEXAS-20
Subject: ONCE AGAIN
To: CLT at SU-AI
Here is a try with another mail system. Let me know if it works.
Thanks for your message. We are flattered and delighted to
have our theorem-prover scrutinized in the way you plan. Here are
some comments on your message.
We suspect that you could embed our system into the result
of adding recursions/inductions over omega↑omega to
Primitive Recursive Arithmetic. We haven't really spent
more than a few moments speculating about going up to e0.
Once or twice we've wished we could try Gentzen's proof of
the consistency of elementary arithmetic, but that was long
ago and we have just had too many other things to do just
with omega. However, if you will recall (from Chapter III)
our induction principle is based on the idea of well-founded
functions. It is possible to define the less-than relation
over e0 as a primitive recursive function in our logic.
If we assumed that that relation were well-founded (just as we
assumed LESSP was well-founded) then inductions over e0 would
be available. In the implementation, the list of well-founded
relations can be set by an act of God; thus it is not hard
to imagine experimenting with such inductions. Of course, you'd
have to define functions that recursed over e0 to "suggest"
to the theorem-prover to try such an induction. But those
functions would be admitted if the less-than relation over e0
were assumed well-founded.
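For concreteness, a minimal plain-Lisp sketch of such a less-than relation (my
sketch, not the prover's actual definition; the Cantor-normal-form coding and the
name ord-lessp are assumptions): a natural number stands for itself, and (CONS A B)
stands for w^A + B with non-increasing exponents.

(defun ord-lessp (x y)
  ;; compare two ordinal codes below e0
  (cond ((atom x) (or (consp y) (< x y)))  ; naturals sit below every infinite ordinal
        ((atom y) nil)                     ; an infinite ordinal is never below a natural
        ((ord-lessp (car x) (car y)) t)    ; compare leading exponents first
        ((equal (car x) (car y))
         (ord-lessp (cdr x) (cdr y)))      ; equal exponents: compare the remainders
        (t nil)))

;; e.g. (ord-lessp 3 '(1 . 0))        => T   ; 3 < w
;;      (ord-lessp '(1 . 2) '(2 . 0)) => T   ; w+2 < w^2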
You ask about computations using functionals of higher
types. We have done some work in that direction. We have a
paper "Metafunctions: Proving Them Correct and Using Them
Efficiently". Do you have a copy? If not, we'll send you
one. In that paper, we get some of the benefits of
functions that take functions as arguments without actually
paying the price of changing the theory to permit variables
to range over functions.
I (Boyer) attended the Logic Seminar last year when Feferman
gave his first talk on his new system. While I was
encouraged that a legitimate logician would see the value of
studying logical systems from within a simple formal system,
I was disappointed that Feferman didn't just adopt some
version of PURE LISP. PURE LISP is a convenient and lucid
vehicle for describing and examining formal systems.
We are delighted that you loaned ACL to Feferman. If you see
him, tell him we're honored to have our work read. (How
hard did he try to read the S-expressions?) We look forward
to getting a copy of his work on FM.
We'd like to help you however we can with your study of our
system. You are welcome to send us questions, phone us
((512)471-7097 office, (512)471-1901 messages), or visit us.
We always have the current version of our theorem-prover
running wherever we are, and you're welcome to an account
here.
Bob & J
-------
∂27-Sep-81 2018 CLT
To: boyer at UTEXAS-20, moore at UTEXAS-20, CLT at SU-AI
Thanks for your enthusiastic reply and offers of `aid and abetment'.
I shall probably take advantage of most of them eventually.
The work will probably progress in discrete quanta as I seem
to be time sharing several projects. Don't think I've given up
if there are intervals when you hear nothing.
I don't have a copy of "Metafunctions: Proving Them Correct and Using Them
Efficiently" and would like to receive on.
I think an account for using the theorem prover would be quite
helpful. Occasional experimenting can be very useful.
How difficult are the `acts of God' (declaring WFRs) to perform?
At some later time it would probably be interesting to experiment with e0.
(I had conjectured that in some sense introducing additional WFRs was
straightforward. One thing I want to explore is how TP might be
taught new constructions (in addition to LEX) and how to use
them effectively. )
I have a couple of simple questions.
(1) Is n=0 allowed in Shell definitions?
(2) With regard to type restrictions -- are they intended to describe type sets?
In the description of the shell principle it seems that tri is formed
from Xi,TRUE,FALSE, using shell recognizers and IF -- which seems
to allow things like (IF Xi Xi (r' Xi)).
Various things seem to work out more neatly if tri is constructed from
TRUE,FALSE,(r' Xi) for allowed shell recognizers r' using IF.
Comments?
You mentioned disappointment that Feferman didn't use some version
of PURE LISP. I am not sure what you mean. Does this refer
to his choice of a recursion theory within the system of
inductively generated classes or are you thinking of a PRA
like system, but based on a pairing structure rather than numbers?
Or yet something else. I am curious. I find the context of
Inductively generated classes extended by an applicative theory
quite `comfortable' to work in.
With regard to the S-expression notation - I don't know how hard he
tried, but he seemed quite baffled (maybe it was the pretty printing -
seems to be a foreign notion). I don't know why he didn't
immediately connect it with the `labeled trees' he used in the FM paper.
The FM paper is ready to mail, should I send 1 or 2 copies?
∂29-Sep-81 1603 CL.BOYER at UTEXAS-20
Date: Tuesday, 29 September 1981 18:00-CDT
From: CL.BOYER at UTEXAS-20
To: Carolyn Talcott <CLT at SU-AI>
Cc: moore at UTEXAS-20
Here are a few responses. The quickest way for you to get a
copy of our meta functions paper is to look at
[UTEXAS-20]<CL.BOYER>META.DOC. You're set up as CL.TALCOTT,
password clt, on UTEXAS-20.
The thing that bothered me the most about Feferman's system
was that he only had one "bottom" object and one
constructor. I like to have literal atoms around, and I
like to have the numbers be distinct from them.
∂30-Sep-81 0740 CL.MOORE at UTEXAS-20
Date: Wednesday, 30 September 1981 09:22-CDT
From: CL.MOORE at UTEXAS-20
To: Carolyn Talcott <CLT at SU-AI>
Cc: boyer at UTEXAS-20, moore at UTEXAS-20
I have sent you a copy of the metafunctions paper in the US Mail.
It is not difficult to add new well-founded relations to the
theorem-prover. One merely defines the function (e.g.,
LESSP) and then puts its name on a certain list. Right now,
LESSP is the only name on that list. Built into the code is
the fact that lexicographic combinations of functions on
that list are well-founded.
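As an illustration of what `lexicographic combination' means here (my sketch, not
the prover's code; the name lex2 is an assumption): two well-founded comparators
combine into a well-founded ordering on pairs by comparing first components first.

(defun lex2 (lessp1 lessp2 x1 x2 y1 y2)
  ;; <x1,x2> precedes <y1,y2> in the lexicographic combination
  (or (funcall lessp1 x1 y1)
      (and (equal x1 y1)
           (funcall lessp2 x2 y2))))

;; e.g. (lex2 #'< #'< 1 9 2 0) => T   ; <1,9> precedes <2,0>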
n=0 is allowed in shell definitions.
In fact, to get the axioms for TRUE and FALSE added (during
the "boot strapping" of the system before the user first
sees it) we just "add the shell TRUE of no arguments with
recognizer TRUEP" and "add the shell FALSE of no arguments
with recoginizer FALSEP", which makes (TRUE) and (FALSE)
objects of two distinct shell types. (However, we have
arranged so that the COUNTs of (TRUE) and (FALSE) are 0
rather than 1 as for other constructors of no args.) It
would have perhaps been more convenient here had we allowed
shells with a bottom object but no constructor, but we don't
do that.
Yes, type restrictions were intended to describe type sets.
Our version of it in the book permits one to test Xi against
FALSE (but not TRUE!) while your rephrasing of it permits
neither tests against TRUE or FALSE since (in the book)
there are no recognizers for those two objects. Thus, if type
sets may contain TRUE and FALSE (as permitted in the book)
both your restriction and ours fall short of permitting the
description of all type sets. However, if we both agree to
admit TRUEP and FALSEP as recognizers (as the implementation
actually does) then I think the two versions are equivalent
because the tri are only used in Boolean positions of our
shell axioms, e.g., in IMPLIES or the test of IFs, where the
only thing that matters is whether the value delivered is F
or not. The odd example you observed that we permit, (IF X X (r'
X)), if used in such a position, is equivalent to (OR (NOT
(FALSEP X)) (r' X)), which is (NOT (FALSEP X)) if r' is not
FALSEP and T if it is.
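A small executable check of that case analysis (my sketch; the Lisp encoding, the
sample recognizer r-prime, and the helper truthy are assumptions, with only F
counting as false in a test position):

(defun truthy (v) (not (eq v 'F)))        ; boolean reading used in test positions
(defun falsep* (x) (if (eq x 'F) 'T 'F))  ; stand-in for FALSEP
(defun r-prime (x) (if (consp x) 'T 'F))  ; an arbitrary recognizer other than FALSEP

(dolist (x '(F T 0 (A B)))
  (let ((lhs (if (truthy x) x (r-prime x)))      ; (IF X X (R' X))
        (rhs (if (or (not (truthy (falsep* x)))
                     (truthy (r-prime x)))
                 'T 'F)))                        ; (OR (NOT (FALSEP X)) (R' X))
    (assert (eq (truthy lhs) (truthy rhs)))))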
You can send us just one copy of the FM paper.
Thanks again. Let us know if you need anything.
J
∂05-Oct-81 1100 CLT to BOYER@UTEXAS,MOORE@UTEXAS
The meta paper has arrived and I am reading it. Will postpone
comments and questions till I have had more time to think.
What I need to know now are the incantations for invoking the
theorem prover, plus any pointers to online documentation, etc..
I have a little documentation for an SRI version (summer 78).
FM is on its way.
With regard to "bottom" objects, Feferman now takes the
intended domain to be the closure under pairing of
a class of `urelements'. Progress is being made!
In my applications I parameterize by an abstract structure
(the `data machines' or data types) and close under pairing.
∂13-Oct-81 1517 CL.BOYER at UTEXAS-20 Our theorem-prover
Date: Tuesday, 13 October 1981 17:15-CDT
From: CL.BOYER at UTEXAS-20
To: CLT at SU-AI
Subject: Our theorem-prover
Sorry for the long delay in answering. Our theorem-prover is now available
on UTEXAS-20. The file [UTEXAS-20]<CL.TP1>CODE.TXT describes the main
changes to the theorem-prover's user interface since the users manual was
printed two years ago.
Thanks for the Feferman article.
∂06-Dec-81 2153 JMC Brown thumb
To: CL.MOORE at UTEXAS-20, CL.BOYER at UTEXAS-20
CC: CLT at SU-AI
Exercising my well-known brown thumb, I got Carolyn to try
(EQUAL (AP U (AP V U)) (AP (AP U V) U))
where AP had the usual definition of APPEND. As I feared,
your prover failed and eventually gave up. The results
are given in AP[F81,JMC]@SU-AI.
∂06-Dec-81 2209 JMC more on brown thumb
To: cl.boyer at UTEXAS-20, cl.moore at UTEXAS-20, CLT at SU-AI
I note that the conclusion of formula 1.1.3 is never true. Can this
happen naturally, or does it indicate a bug?
∂07-Dec-81 0817 CL.MOORE at UTEXAS-20 more on brown thumb
Date: Monday, 7 December 1981 10:13-CST
From: CL.MOORE at UTEXAS-20
To: John McCarthy <JMC at SU-AI>
Cc: cl.boyer at UTEXAS-20, CLT at SU-AI
Subject: more on brown thumb
Instances of the associativity of AP are what we use when
someone asks us "what's the simplest theorem your system
can't prove." It just doesn't do well on instantiated
versions of beautiful theorems. Of course, had you first
had the system prove the associativity of AP in its general
form (as a REWRITE lemma using PROVE.LEMMA) all instances of
it would have been proved immediately.
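For the record, the suggested event list would look roughly like this (a sketch only;
the spelling of the DEFN and the lemma name are my guesses, though PROVE.LEMMA and
the REWRITE class are as described above):

(DEFN AP (X Y)
  (IF (LISTP X)
      (CONS (CAR X) (AP (CDR X) Y))
      Y))

(PROVE.LEMMA ASSOCIATIVITY.OF.AP (REWRITE)
  (EQUAL (AP (AP X Y) Z)
         (AP X (AP Y Z))))

With that lemma stored as a rewrite rule, the instance
(EQUAL (AP U (AP V U)) (AP (AP U V) U)) falls out immediately, as the message says.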
The fact that the theorem-prover produced an invalid subgoal
is neither unusual nor cause for worry. The system sometimes
generalizes the formula to be proved. This sometimes
produces a goal that is not a theorem. Assuming the
theorem-prover is sound this just means the system will
fail to find a proof.
Generalizations can happen in several ways. The two most
common are by the elimination of some hypothesis (in our
"cross-fertilization" equality substitution heuristic -
Chapter XI of our book A Computational Logic) and by the
replacement of a nonvariable term by a new variable (in our
"generalization" heuristic -- Chapter XII).
In your example the theorem-prover generalized after the
first induction. This produced formula 1.1, which is still
a theorem and which the theorem-prover tried to prove by a
second induction. The induction step of that second
induction led to the non-theorem *1.1.3. Here are the last
few steps in the production of *1.1.3. Bracketed comments
explain where we lost the validity of the formula.
(IMPLIES (AND (EQUAL (AP X (AP W (CONS Z X)))
                     (AP Y (CONS Z X)))
              (EQUAL (AP X (CONS D (AP W X)))
                     (AP Y X)))
         (EQUAL (AP X (CONS D (AP W (CONS Z X))))
                (AP Y (CONS Z X)))).
[This formula, being the induction step for a valid
formula, is valid. However, note that if the first
hypothesis is true the second is false and vice versa. In
addition, if the first hypothesis is true, the conclusion is
false. In the next step we substitute, into the conclusion,
the left-hand side of the first hyp for the right hand side.
This makes the conclusion false. If we did nothing more the
resulting formula would still be valid because it has
contradictory hypotheses. But we throw away the first hyp
-- on the grounds that it has been "used".]
We use the first equality hypothesis by substituting
(AP X (AP W (CONS Z X))) for (AP Y (CONS Z X)) and throwing away the
equality. We would thus like to prove:
(IMPLIES (EQUAL (AP X (CONS D (AP W X)))
                (AP Y X))
         (EQUAL (AP X (CONS D (AP W (CONS Z X))))
                (AP X (AP W (CONS Z X))))).
[The above formula is invalid. Of course, the theorem-prover doesn't
know that and so it proceeds to generalize it still further before
going into induction on it.]
We will try to prove the above formula by generalizing it, replacing
(AP W (CONS Z X)) by A. We restrict the new variable by recalling the type
restriction lemma noted when AP was introduced. This produces:
(IMPLIES (AND (LISTP A)
              (EQUAL (AP X (CONS D (AP W X)))
                     (AP Y X)))
         (EQUAL (AP X (CONS D A)) (AP X A))).
Give the above formula the name *1.1.3.
Bob and J
∂14-Jun-82 0922 CL.BOYER at UTEXAS-20 halting problem
Date: Monday, 14 June 1982 11:18-CDT
From: CL.BOYER at UTEXAS-20
To: jmc at su-ai, clt at su-ai, atp.bledsoe at UTEXAS-20
Subject: halting problem
You might be interested in [utexas-20]<cl.boyer>undec.doc,
which is a preliminary version of a paper about a mechanical
proof of the unsolvability of the halting problem.
Bob & J
Notes to send
About MC.FLATTEN example in CHII
Explanation of the evaporation (without comment) of the base case of the
second induction
(IMPLIES (AND (LISTP U) (LISTP V))
         (EQUAL (APPEND U (APPEND V A)) (APPEND (APPEND U V) A)))
might be helpful
SL problem
Why not tc(WFN) in initial shell principle
Why one Bottom
Why not work harder to obtain ∃ witnesses?
LEFT.HACK
NOTE.FILE does redo do knowledge base contrary to advertisement?
% ____________________________________________________________________ %